
eGPU experiences [version 2.0]


Tech Inferno Fan


So I'm trying to find out the performance hit an eGPU set up would take compared to a desktop.

This all depends on what kind of setup you would be using. If you're able to negotiate an x1 2.0 connection using an NVIDIA card with Optimus, it looks like most people with this configuration are getting ~60-75% or more of the card's performance. Using higher-end video cards typically still results in better performance than using lower-end cards, but there is some overlap. If you check out the first page of the thread, there is a leaderboard with benchmarking results. If you're not able to use the Optimus configuration, or if you can only negotiate a 1.0 connection, then your performance will degrade further.

Just to give you a comparison:

Standard Desktop PC (Closest I could find on the 3DMark site with similar processor speed):

Intel Core i7-3770K Processor (3,500 MHz)

GTX 780

3DMark06 Score: 33133

Laptop with eGPU (taken from the first page)

i7-3720QM 2.6GHz (3,600 MHz turbo)

GTX780@x1.2Opt (this is x1 2.0 with the Optimus configuration)

3DMark06 Score: 25860

Based on just the above, that's a performance hit of about 22% ((33133 - 25860) / 33133) from using the same video card in an eGPU setup with a similar-speed processor. Still, that is one of the high-end cards right now, and it's still performing very well.

What does that 22% mean to you? If you're able to reach 60FPS with your current video card in a game right now hooked up to a 3.6GHz desktop processor, and you use that same video card in an eGPU setup on a laptop with a similar 3.6GHz processor, you'd be losing about 13FPS, which means you would be down to around 47FPS - still playable.
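
For anyone who wants to check the arithmetic, here's a quick sketch. It assumes FPS scales linearly with 3DMark06 score, which is a simplification; note the ~22% figure is the loss relative to the desktop score, while ~28% is the same gap expressed relative to the eGPU score:

```python
# Rough FPS estimate from the 3DMark06 scores above.
# Assumes FPS scales linearly with benchmark score - a simplification.
desktop_score = 33133   # i7-3770K + GTX 780 (desktop)
egpu_score    = 25860   # i7-3720QM + GTX 780 eGPU (x1 2.0 Opt)

hit = 1 - egpu_score / desktop_score          # loss relative to the desktop
hit_vs_egpu = desktop_score / egpu_score - 1  # same gap relative to the eGPU

desktop_fps = 60.0
egpu_fps = desktop_fps * (egpu_score / desktop_score)

print(f"performance hit: {hit:.1%}")          # ~22.0%
print(f"gap vs eGPU score: {hit_vs_egpu:.1%}")  # ~28.1%
print(f"estimated eGPU FPS: {egpu_fps:.0f}")  # ~47
```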

Just to throw some wrenches into the gears: how well a game performs with limited bandwidth available really depends on the game itself. Different amounts of data are transferred at different times based on how the game was programmed, what is being displayed, and the settings you're using within the game. While we can estimate performance, there's no guarantee that your game will run flawlessly all the time with only a slight loss in FPS - you could get stuttering and other strange effects.

What I think I get so far, is that output to an external screen gives better results than the LCD. All I'm looking for is optimal performance for an external output. I'm sorry if I phrased the entire post wrongly, I'm still trying to wrap my head around the whole eGPU thing.

This is true. By using the internal display, you're consuming some of the bandwidth the video card would be using to transfer the video data back to the laptop for display. Right now you'll always get better performance using an external display that is connected to the video card.


Hi nando,

Using your latest Setup 1.30 test2, my GTX660 is successfully found and switched to Gen2, but I'm getting "no solution found" when I run compact.

Dell E6430, Win 7 32-bit. I deleted the incorrect "PCI bus" section in the devcon.txt file.

Please see the attachments and help, please...

post-19577-14494996402257_thumb.jpg

post-19577-14494996402045_thumb.jpg


Your TOLUD=3.5GB. If a 32-bit ALL compaction doesn't find a solution then you need to do a DSDT override.

Note: My E6230 decreased its TOLUD from 3.5GB to 3.25GB upon detecting my eGPU at boot (REF: http://forum.techinferno.com/diy-e-gpu-projects/2747-%5Bguide%5D-12-dell-e6230-gtx660@[email protected] ). If you can get yours to do the same then you won't need the DSDT override.
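
For anyone wondering why a 3.5GB TOLUD forces the override, here's the back-of-envelope arithmetic. The BAR sizes below are illustrative guesses, not measured values for this card, but they show how little 32-bit address space is left for devices:

```python
# Why a high TOLUD can break 32-bit PCI compaction (illustrative numbers).
GB = 1024**3
MB = 1024**2

addr_space_32bit = 4 * GB   # everything addressable below the 4GB boundary
tolud = int(3.5 * GB)       # Top Of Low Usable DRAM on this E6430

mmio_window = addr_space_32bit - tolud   # space left for device MMIO
print(f"32-bit MMIO window: {mmio_window // MB} MB")   # 512 MB

# A desktop GPU typically wants a large prefetchable BAR plus smaller BARs,
# on top of what the iGPU, chipset and other devices already claim -
# so a 512MB window can easily come up short ("no solution found").
egpu_bars = 256 * MB + 32 * MB + 16 * MB   # illustrative BAR sizes only
print(f"eGPU wants roughly: {egpu_bars // MB} MB")
```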

Link to comment
Share on other sites

Thank you for your answer.

1) I tried the method of connecting the eGPU before booting, but sadly it doesn't work - the computer won't wake up, no backlight on the screen (jumper SW2 set to 2-3, SW1 to 1).

I also tried setting SW2 to 1-2 and SW1 to 3 and then starting the Dell - the OS starts, but the value doesn't change from DFA to CFA in Device Manager, which means the eGPU probably isn't detected at boot :(

2) I've done the DSDT override earlier; there was an error when iasl tried to generate the .aml file, so I used a newer version of iasl that ignored this error and generated the .aml file. I loaded this file, and after a restart I see something like a "big memory" entry in Device Manager - please see the screenshot:

post-19577-14494996402423_thumb.jpg

Should I do the DSDT override again with the older iasl and without any errors, or is there maybe another solution?

EDIT: Added a screenshot of the error when trying to run: iasl dsdt_CBX3_EGPU.dsl

post-19577-14494996402661_thumb.jpg


Unfortunately you have no "Large Memory" entry in the Device Manager output. Please redo your DSDT override. Linked off that page are helpful hints by angterthosenear and kizwan on getting a successful DSDT compilation.


OK. Here's my step-by-step DSDT override - done while the eGPU was disconnected:

1. Command iasl.exe -g

post-19577-14494996402954_thumb.jpg

2. The File that has been created: dsdt_CBX3.zip

3. I found this section in that file:

Device (PCI0)
{
    Name (_HID, EisaId ("PNP0A08"))
    Name (_CID, EisaId ("PNP0A03"))

Below the last DWordMemory entry in that area (line 2272) I have a 'QWordMemory' entry, and I've added the additional 'QWordMemory' provided by the instructions (marked with a red bracket in my screenshot). This is how it looks now:

post-19577-14494996403099_thumb.jpg

4. I've made the .aml file without any errors using the command iasl dsdt_CBX3.dsl (download the .aml file here: DSDT.zip )

5. Loaded the table into the registry as shown below:

post-19577-1449499640352_thumb.jpg

What do I have to do now?


What to do now? Per the DSDT override instructions, confirm you have a "Large Memory" item appearing in Device Manager. If so, then set PCI compaction -> endpoint to 56.25GB (36-bit) and perform an eGPU-only PCI compaction, followed by Chainloader -> Test Run. That will allocate the eGPU into 36-bit PCI space, eliminating the error 12 against it.


I tried it that way, plus setting the Gen2 link speed, and it works... until I run GPU-Z or any other application - then the system freezes and I get a BSOD :/

It seems that when I run my GTX660 with 1.1Opt, all is working great.

But when I try anything to force the GTX660 to 1.2Opt, the system freezes after running any application - the question is why?


Hi everyone, I am new to this forum. Recently I have been torn between building a high-end desktop and buying a high-end laptop. I would really like to have more portability and the ability to relax on a couch or chair, rather than be restricted to my desk, since I am generally at a desk during the day developing software. The laptop I am considering is a Clevo P370SM with two 780M GPUs in SLI. My goal is to be able to play most games (I understand not all games are optimized for SLI) at maximum settings with 3D Vision turned on, although full anti-aliasing is something I can live without. Is the Clevo I'm looking at the best means of doing this, or would I be better off scaling down to a less expensive (and more portable) laptop/GPU and investing in an eGPU solution? Thanks in advance!


I tried it that way, plus setting the Gen2 link speed, and it works... until I run GPU-Z or any other application - then the system freezes and I get a BSOD :/

It seems that when I run my GTX660 with 1.1Opt, all is working great.

But when I try anything to force the GTX660 to 1.2Opt, the system freezes after running any application - the question is why?

This means that when your eGPU is running a Gen2 link and it's under load, it fails. We see the failure you describe typically when using only Gen1-compliant hardware (PE4H 2.4 or PE4L 1.5 or older). Gen2-compliant hardware is the PE4L 2.1b or PE4H 3.2. Another reason it could fail with Gen2 hardware is that the ExpressCard slot itself doesn't meet the more stringent termination requirements for the higher speed, or you have a bad batch of supposedly 'Gen2' hardware.



Ah brilliant, I'm compelled to post to further emphasize my gratitude for making it so clear cut.

Like most people considering this solution, I'm looking for an alternative that maintains some relative portability and saves some money, instead of getting a desktop to make up for the lack of power.

Currently I'm considering a notebook and basing its viability on a decent balance between portability and eGPU performance for when I'm at home, and the refreshed rMBP 13" with Thunderbolt 2 comes to mind. From what I can grasp, it gives the optimal (theoretical) output so that I can get the most out of the intended GPU.

What I don't understand is how the W530 scored more than the rMBP 15" with the same GPU. I thought Thunderbolt has superior bandwidth compared to the other options, e.g. in this case, I thought ExpressCard < Thunderbolt?


Hi guys, I came across this mod (an LCD/LED controller): http://www.ebay.com/itm/M-NT68676-2A-HDMI-DVI-VGA-Audio-LCD-LED-Screen-Controller-Board-Diy-Monitor-Kit-/110977522562?pt=US_Server_Boards&hash=item19d6c69b82

Display output would be connected directly from the eGPU to the modified laptop monitor with this:

http://www.ebay.co.uk/itm/DVI-I-Male-to-DVI-I-Male-Dual-Link-Cable-29-pin-Gold-3m-/180715065524?pt=UK_Computing_Sound_Vision_Video_Cables_Adapters&hash=item2a137510b4

Would the result be the same as running the eGPU with an external monitor, which offers a higher framerate?

Edit:

If it produces the same result, then mounting the monitor back on the laptop with the circuit boards behind it, just like in one of the pics, would be fine, I guess.


  • Moderator

Yes it would behave as if you had an external monitor hooked up. Would be a very neat project. You would just have to secure the eGPU to the laptop so you don't tug on / pull / break any wires.

I was thinking of a way to have a 13" laptop mobo/innards ported into the 17" comparable model and have the eGPU internal and do something just like you mentioned. Would give a hybrid laptop CPU with desktop graphics. Would be neat.

Not sure how the power situation would be dealt with, however - pulling enough power from the battery to power a desktop (higher-end) GPU.

Or have it switchable for laptop mode or eGPU mode (depending on mobile / on-battery or near a mains power and thus can have eGPU powered).

----

I got a Dell M2010 with dead video that I might be able to repurpose.....



An x1 2.0 link provides 500MB/s (or 5GT/s) of bandwidth. Thunderbolt 1 is equivalent to an x2 2.0 link, which provides 1GB/s (or 10GT/s). While Thunderbolt does provide additional bandwidth, you cannot run Optimus compression with it, as Optimus requires an x1 link. Based on the benchmarks out there, including from people who have tried ExpressCard and mPCIe slots for their eGPUs, the compression seems to more than make up for not having an x2 link.

Thunderbolt 2 however is likely going to win the bandwidth war as it is essentially an x4 2.0 link, providing ~20GT/s in bandwidth. At this level of bandwidth most video cards are able to achieve roughly 80-95% of their performance based on the scaling analysis that has been performed (again links to this are present on the front page of the thread). The trouble with Thunderbolt right now is that all of the TB external PCIe solutions are horribly expensive ($300 and up) and it's not widely adopted on laptops, leaving your choices of laptops fairly limited.

For me, I want a convertible laptop that will be replacing an aging desktop PC I have, and I have my eye on the Sony Flip 13. It doesn't have ExpressCard, Thunderbolt or mPCIe slots, but it has 2 of the NGFF M.2 connections inside. One of them could provide the same bandwidth as Thunderbolt 2 (or even more!), as it is an x4 slot and the M.2 standard is meant to support a 3.0 link, meaning a total bandwidth of ~32GT/s would be available if a 3.0 link is negotiable. I'm reluctant though, as the laptop is going to be $1600, and NGFF M.2 slots are so new that no one has done any testing with them (I may have to be the first!).
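
The bandwidth figures quoted in this thread all fall out of the same formula: lanes x transfer rate x line-code efficiency (8b/10b for Gen1/2, 128b/130b for Gen3). A quick sketch of the theoretical one-way rates (real-world throughput is lower):

```python
# Effective one-way PCIe bandwidth: lanes x transfer rate x encoding efficiency.
def pcie_bandwidth_gbps(lanes, gt_per_s, encoding):
    """Usable GB/s after line-code overhead (GT/s ~ Gbit/s on the wire)."""
    return lanes * gt_per_s * encoding / 8

links = {
    "x1 2.0 (ExpressCard)":   (1, 5.0, 8 / 10),
    "x2 2.0 (Thunderbolt 1)": (2, 5.0, 8 / 10),
    "x4 2.0 (Thunderbolt 2)": (4, 5.0, 8 / 10),
    "x4 3.0 (NGFF M.2)":      (4, 8.0, 128 / 130),
}
for name, (lanes, rate, enc) in links.items():
    print(f"{name}: {pcie_bandwidth_gbps(lanes, rate, enc):.2f} GB/s")
# x1 2.0 -> 0.50 GB/s, x2 2.0 -> 1.00 GB/s, x4 2.0 -> 2.00 GB/s
```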



Awesome! I knew it would work. I've seen a vid on YouTube where a guy completely dismantled his laptop screen and hooked up an eGPU directly to it.

He didn't even fix the modded screen back onto his laptop.

And 13" to 17" would take loads of laptop frame modifications, I guess, which I have no idea how to do.

But the end result would be fantastic!


  • Moderator

I was thinking of having a 13" mobo on one side of the enormous 20.1" (about 23" wide) Dell M2010, then a slim desktop PSU (or 360 PSU) and the eGPU on the other side. The hardest thing would be to find a really nice laptop mobo to put in there. mPCIe would be the cleanest option here (unless of course an NGFF M.2 adapter comes out - that would be ideal).



3D Vision requires specific displays in order to show the 3D (typically they need to support 120Hz signalling). I do see there is a specific version of the Clevo P370SM (the P370SM3, I believe) that has the 3D display built in. If you don't get a laptop that has the required 3D screen, then you're limited to having a desk with a 3D monitor or a 3D TV. When using 3D Vision, the GPU essentially has to render every frame twice at two different perspectives in order to provide the 3D effect, meaning you need a decent GPU to push out at minimum 60FPS (30FPS per eye). In an eGPU environment you're not able to get the full processing power out of the video card you have attached, so using 3D Vision may not be the best idea with an eGPU, especially if you want to run at max settings too.

Your scenario provided above, that being that you want to game on a couch, could be accommodated by a desktop PC, a gaming laptop with a dedicated GPU, or a laptop with an eGPU, depending on how you want to set up your gear. If you can provide details about what you want to do while mobile, I could make a better recommendation for you. For now, based on your desire to run max settings, use 3D Vision, and not have to be at a desk, my recommendation would be for you to stick with the 3D version of the Clevo, for being able to 3D game anywhere without any additional requirements.


Anyway, I have an Acer V5-551-8401 with a Radeon 7600G (mobile) and 6GB of RAM. The GPU has 512MB of memory. It has a quad-core processor (AMD 4555M, 1.6GHz, 2.5GHz turbo boost), and I want to build an eGPU. I need to know what GPU would be suitable for me without bottlenecking my GPU/CPU (I need some low-cost options too), because I got this PC in November 2012 and I don't want to upgrade just for the GPU's sake.


Greetings to all the gentlemen and ladies at the TechInferno forums.

I have been interested in eGPUs for a while and recently finally made my purchase. I was able to get the system running once or twice, but the third time never came. Not only did the card bust itself, the PE4H v2.4 board also burst into flames in front of my horrified face. After getting a new card and replacing the chip on the board, I am faced with a new issue.

My setup is as follows:

PE4H v2.4 with the EM2C part (I also have the PM3N, still boxed, for a planned x2 connection)

Laptop: Lenovo 3000 G430 running XP SP3

Graphics card: MSI HD 7730 1GB GDDR5

I powered the setup using either: - the laptop charger, with output listed at 19V, 3.16A

- a decade-old CODEGEN 250W PSU with +12V rated at 9A and +3.3V rated at 14A

My current situation is:

When I plug the power in, the GPU fan runs for a few seconds before stopping; after consulting a vid on YouTube, I decided that's a normal reaction.

I met error 12 before, so I used Solution 7 of using the Magma Full Drive; it's what got my system running before.

I plugged everything in in order: eGPU ON, laptop ON, plug the cable in when the screen prompts to - actually I tried all the orders in the FAQ, same difference.

The GPU fan will not run, and Windows is unable to find the card either.

Would anyone be able to help me point out where the problem is?

And I have another question: would using both of my power sources at the same time turn my setup into firework material again?

Many thanks and a great day to all of you.

Excuse me, could someone take a look at my problem?


@Diarek

One question: do you have Setup 1.x?

@All

I now have my second eGPU. I've also done some testing with my Lenovo X220 and different GPUs - would anyone have a problem with me presenting the tables here? I have also made some tests on the PE4L-EC060A and the PE4L-EC100A, plus I have a few tables from those. Or is it not necessary?

You get the image of my new case in any case ;) :

post-10765-14494996412779_thumb.jpg

If you still have any questions, ask.

PS: I love the number of smilies here :21_002:



As far as I know, the DIY eGPU solution is for Intel-processor-equipped notebooks only.

But on the topic of AMD, I also wanted to bring to the attention of forum experts this news:

AnandTech Portal | AMD 2014 Mobile APU Update: Beema and Mullins

Of importance in the above article is this:

While DockPort sounds interesting (a non-Intel alternative to Thunderbolt that basically combines DisplayPort 1.2 with USB 3 into a single cable), AMD said precious little about DockPort in their presentation. Someone asked about it, and AMD said it was “up to laptop manufacturers” and that was about it. There’s the above slide as well, showing how a single cable could drive three external displays along with a variety of peripheral devices, but we’ll have to wait and see how many companies are willing to jump on the DockPort bandwagon.

  • Moderator

Wow that is a slick case. All the cuts and screw holes are so clean / flush. Very nice work.

Post up your tables!

