
eGPU experiences [version 2.0]


Tech Inferno Fan

Recommended Posts

Hi @all,

I am also considering setting up and assembling an eGPU solution. But I am still wondering about the advantage of having Optimus enabled:

Is it "just" the use of the external eGPU with the main (built-in) screen, or will it also boost performance? AFAIK the Optimus technology drops performance. Regarding the mobility aspects of an eGPU solution Optimus would be great, but then I would have to deal with less performance - right?

@Dschijn: it is probably too naive to mention it, but maybe a PRAM reset (Alt+Cmd+P+R held while rebooting) could help?

Hi,

In our case Optimus compresses the data sent to the eGPU. Because the bottleneck is in most cases this connection, it speeds everything up.

Sent from my Nexus 5 using Tapatalk


Hi guys,

I've been reading the forums for about a week and have gone through all the implementation guides, but these two adapters seem quite new / rarely used with mPCIe systems.

My laptop does not have an ExpressCard slot or Thunderbolt, so my only option is the mPCIe socket.

I noticed the PE4L/H/C series have CLKRUN switches to delay PCI detection. Do I actually need this for mPCIe, or can I use the much cheaper EXP GDC V6 without needing to hot-plug on each boot?

From what I have read, my laptop does not have whitelisting enabled.

Plan 1:

EXP GDC V6 - mPCIe

Dell DA-2 220watt

NVIDIA GTX970

Plan 2:

PE4C - mPCIe with CLKRUN delay switches

Dell DA-2 220watt

NVIDIA GTX970

Specs:

i7-2630QM

HM67 chipset

6GB Ram

NVIDIA GT 555M 3 GB with Intel HD 3000 in an Optimus config.

Thanks guys


I'm no expert, having only read over the forum myself and not actually built an eGPU, but both the EXP GDC V6 and V7 have a TD switch for 'resolving hardware issues' (see Banggood and AliExpress), which might do the same thing. But like I said, I know about as much as you, so it might be better to wait and see what others have to say. See Tech Inferno Fan's post below for the correct answer.


PE4C V2.x is recommended for mPCIe implementations due to its new CLKRUN feature. Without that, some systems require live hot-plugging of the mPCIe-ended cable to start the CLK and allow the eGPU to be detected. AFAIK, the EXP GDC has no CLKRUN delay.

PE4C V2.1 is now available without the AC adapter for a cheaper price:

US$83-shipped PE4C V2.1 (0W)

US$134-shipped PE4C V2.1 (220W)



Hi, okay. But also with a Thunderbolt 2 connection?

I am thinking of using an MSI Gaming 970 G4 with the Akitio Thunder2 PCIe Box and a Thunderbolt 2 MacBook Pro. Since I currently have a dGPU (GT 750M) in addition to the iGPU, I have not been able to use the Optimus feature so far, but I am thinking of selling it and swapping for an iGPU-only MacBook Pro, if and only if Optimus could speed things up.



Optimus engages x1 PCIe compression, boosting mPCIe/EC links. It will have no effect on a TB/TB2 link using an AKiTiO Thunder2 since that uses a x4 2.0 link. The only benefit of getting the iGPU-only 15" MacBook Pro would be an accelerated internal LCD mode AND longer battery life in Windows.



Thank you for clarifying, but I still wonder whether this internal LCD acceleration will lead to less performance.

So the best performance is only possible with an external LCD; the internal one can be used by enabling Optimus but comes with less performance - is that the right conclusion?



Yes, that is correct, as per Tech Inferno Fan's writings.

Sent from my SM-N9005 using Tapatalk



Thank you - I was still puzzled, since Nando4 wrote:

Optimus [...] will have no effect on a TB/TB2 link using an AKiTiO Thunder2 since it uses a x4 2.0 link.
but it does indeed drop the, let's say, FPS rate....

However, I would love to just grab my laptop and the eGPU setup and have everything with me, since an external LCD is not always an option.


Sooo, I am about to purchase an entry-level card and an EXP GDC due to budget constraints (and the fact that I've never played games on ultra graphics settings, so I won't be bothered by the lack of them). But I have a 1st-generation i5 with an ExpressCard slot, which I understand only supports x1.Opt for the best performance. I don't think anyone in this thread has lived with an entry-level card, so I'm wondering: will a 1 GB GDDR5 GT 730 be a big waste, or will I gain considerable performance over my Intel HD Graphics iGPU? (Looking for playable FPS at native resolution, internal LCD.) I don't have the money to buy anything other than a sub-$100 card, so there's that. Thanks in advance!


Hi everyone! I am interested in an easy and universal solution for using an eGPU with the internal LCD. Then the whole eGPU set can be made quite portable.

I propose to extend what was discussed and described here and here. Frame grabbers, although always ready, may be considered a relatively expensive solution, and they occupy a slot/port and add one more cable to your system.

Let's share ideas and experience, namely: 1) how to start a game on the iLCD by means of some software, 2) the CPU load with such solutions, 3) problematic games.

1) The most common solution is through Ultramon. Its developers say it is video-card independent, as it should be given its function.

Actual Window Manager is also mentioned as an alternative. Still, I don't know what is missing, if anything, for Windows 7/8 display cloning to be able to run games on the iLCD. This could be requested for the next version of Windows.

Everyone also reports that a game should run in a window. The best solution is the game's own supported window mode. This option can be present somewhere in the game's dialogs (e.g., Oblivion). Alt-Enter may also help (e.g., AoW SM). In some cases adding the parameters "-W", "-w", "-Window" or "-window" can help (e.g., Fallout Tactics).

D3DWindower successfully windowed StarCraft, but after applying it to AoW SM the image in the window was completely unacceptable, and after a click the game restored its fullscreen mode. D3DWindower didn't affect Fallout Tactics at all. With DxWnd and 3D Analyze I got even worse results, although DxWnd is newer than D3DWindower. I hope I'll be able to run all the games in window mode.

But is window mode really needed? This part of the text is somewhat technical and may be more appropriate for Stack Overflow, so a casual reader can skip it. In Windows XP I checked the 3 mentioned games with Spy++ and found that they all have windows even though they run in fullscreen mode. Maybe there is no other possible realization through DirectX or OpenGL. So I don't understand why that window is not enough (I haven't tried that yet, anyhow).

And if a game is windowed successfully, it may try to restore its original state, as happened with AoW SM. In that case a newly created window with grabbed frames and redirected input might be a workaround.

2) I propose to find out what CPU load we get using the method described above with different software.

In my preliminary test of mirroring an application at 1280x800 and 25 fps through Ultramon (the window was not maximized, around 1024x768), I got around 7% CPU load, not the 40% I found somewhere.

To make the testing unambiguous we need to express the load in MHz or GHz (not %), in the following way. You can use Windows Task Manager (open the list of processes) or the free Process Explorer (WinXP or later). We need to know what 100% means there, and we also need to know the current CPU or core frequency. Most likely your CPU reduces its frequency when its load is small, but I don't know the threshold percentage, so to be safe we should know the CPU's nominal frequency and accept an overestimated result in MHz. In WinXP this can be checked in CPU-Z (I have ver. 1.71); in Win 8, and maybe earlier versions, Task Manager can also show the current CPU frequency.

With Turbo Boost (many CPUs already have it; it can probably be switched off in the BIOS), individual cores can raise their frequency above the nominal one. As I have a CPU without Turbo Boost, I don't know how to get the current frequency of the core running the processes of interest, or what 100% CPU load means in Task Manager or Process Explorer - say, will there be more than 100% if all cores are fully loaded? You may need to figure this out. But don't forget to save your unsaved work before loading the CPU to 100%.

To fully load a core you can create the following 2-byte program: EB FE (in hex). Alternatively you can enter those bytes in a text editor via their character codes using the standard Windows character map application. Name the file with an "exe" extension, or "com" as an alternative. In WinXP I had to press the "Ignore" button; in Win 8 nothing appeared. The ntvdm parent process will then fully, or almost fully, load your core.
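For anyone who would rather not hand-assemble the EB FE byte pair, a rough equivalent can be sketched in a few lines of Python (my own illustration, not from the post above): it spins one core at 100% like the jump-to-self loop does, but with a time limit so it stops on its own.

```python
import time

def burn_core(seconds: float) -> int:
    """Busy-loop on one core for `seconds` seconds, like the EB FE
    (jump-to-self) program, but bounded so it exits cleanly.
    Returns the number of loop iterations performed."""
    iterations = 0
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        iterations += 1  # pure busy work; keeps one core pegged near 100%
    return iterations

# Watch Task Manager or Process Explorer while this runs:
# one core (or one hardware thread) should sit near 100%.
burn_core(1.0)
```

Unlike the ntvdm trick, this loads exactly one thread, which the OS scheduler may migrate between cores; pin it with Task Manager's processor-affinity setting if you want a stable per-core reading.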

Now some examples of how to interpret the percentages. In the case of a 2-core CPU without Turbo Boost, one ntvdm process should load the CPU by approximately 50%, meaning one fully loaded core working at the CPU's maximal frequency. So in this case you should multiply a process's percentage by 2 (the number of cores) and then by the CPU's frequency.

In the case of 2 cores and 4 threads (or 4 virtual cores, i.e. hyper-threading) without Turbo Boost, one ntvdm process loads the CPU by approximately 25%. AFAIK each real core shares its resources between its own threads, so 25% should mean one fully loaded virtual core at half the CPU's frequency. Then a process's percentage should be multiplied by #threads/(#threads/#cores) = #cores, as in the previous case.

Then 7% on my 2-core CPU at 2.16 GHz would mean just 302 MHz, or just 14% CPU load on a single-core CPU at 2.16 GHz.
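The arithmetic above can be captured in a small sketch (again my own illustration; the 2-core, 2.16 GHz and 7% figures are the ones from this post):

```python
def load_in_mhz(percent: float, n_cores: int, core_mhz: float) -> float:
    """Convert a per-process CPU percentage from Task Manager into an
    effective clock figure: percent/100 * number_of_cores * core MHz."""
    return percent / 100.0 * n_cores * core_mhz

# 7% on a 2-core CPU at 2.16 GHz (2160 MHz):
mhz = load_in_mhz(7, 2, 2160)           # ~302 MHz
pct_one_core = mhz / 2160 * 100         # ~14% of a single 2.16 GHz core
print(round(mhz), round(pct_one_core))  # -> 302 14
```

On a hyper-threaded CPU, pass the number of physical cores, per the reasoning in the previous paragraph.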

Please present your results for the software used if they are notably different. Also specify the software used, your OS and your GPUs.

Resolution 1280x800, game window maximized or made as large as possible; the CPU load should depend on this.

3) If you have a game which could not be run on the internal LCD by the mentioned methods, please specify it so we know the situation.


NVidia Optimus or LucidLogix provides fully integrated internal LCD support for windowed or full screen apps.

However, Optimus requires an Intel iGPU + an NVidia GTS 450 or newer. 4500MHD iGPUs included in older Core 2 Duo systems are no longer supported as of driver 306.97 WHQL. An extra benefit of Optimus is that it provides x1 PCIe compression, greatly accelerating EC/mPCIe eGPUs under DX9.

LucidLogix Virtu (http://forum.techinferno.com/diy-e-gpu-projects/2967-lucidlogix-virtu-internal-lcd-mode-amd-egpus.html) only works with 2nd-gen or newer i-core systems with an iGPU and is a pain to get going on a mobile platform.

Ultramon/Chung-gun method: an Ultramon mirroring solution using a dummy external LCD configuration to allow dragging your eGPU-accelerated *windowed* app to the internal LCD. It works; the downside is that it doesn't handle full-screen apps.

If there are any newer solutions than these, then please post.


It seems I hadn't completely understood what you said about Optimus before: an Intel iGPU + NVidia GTS 450 will work with Optimus. And for a slow PCIe connection Optimus might help; that is why I plan to buy one of those NVidia cards. But what about DirectX 10, 11, 12? E.g., the GTX 550, 570 and many other relatively old video cards are said to support DirectX 11 and partly 12. And, most importantly, NVidia Optimus seems not to be a reliable solution. It could disappear because of inadequate driver support or newer hardware without that support; NVidia can discontinue it. Also I saw complaints that Optimus under WinXP created serious problems...

The LucidLogix Virtu solution seems more attractive. But I saw somewhere that it's not for all NVidia and AMD video cards; maybe it is for the listed ones AND newer ones. But one could have an older motherboard or just a different CPU. There are simplified CPUs (Intel Atom, Pentium, Celeron and others) still in production, and there are also AMD CPUs. So if you want to play with a pal who was not aware of these (pretty strict) hardware restrictions, you will have a problem.

The Ultramon/Chung-gun method seems to be a simpler software realization, and one can reasonably expect that it will keep working in future versions of Windows, and that there will be corresponding software to realize this method under newer versions of Windows.

I believe you that this method works. But as I mentioned above, there is no guarantee you will be able to start a game in window mode. And this limitation may not be necessary; I don't see why it is needed yet. Maybe we could ask for changes in Ultramon or an alternative program, and that would be quite a small and easy modification.

And there is also the question of the CPU load from Ultramon and from a windowing program, if one is used. If it loads one core almost completely, then this may be a problem; if the CPU load is smaller, then it's OK. A modern game may need 2 cores at around 2 GHz. Look at the CPU requirements of, say, Crysis 3 or AoW 3.



Optimus is the best transparent solution if, as you point out, you have hardware and software that NVidia enables it on.

Ultramon mirroring was used only to give a visual reference to what was happening on the eGPU-attached dummy LCD. That way you can see a screen where you can navigate and start your app on the eGPU, then drag it to the internal LCD. Once the app has been dragged, mirroring is switched off and the underlying Windows drivers/display manager take over to use the eGPU to render the image but display it on the internal LCD. The only additional CPU load was from the underlying drivers. These manual steps work in pretty much the same way that NVidia Optimus does for windowed apps, but in a more tedious manner.

I'd advise testing the Ultramon solution to get a clear understanding of how it works, from which you can then edit your points about 'mirroring' CPU load to avoid distracting users.


Optimus should be the best solution. I consider the discussed method more of a workaround.

Now I see that Ultramon cannot transfer mouse and keyboard input between two screens; this is done by the OS, so we should check how some system processes load the CPU instead. I'll correct my first post.

Now I see the discussed method as follows: we need Ultramon because we need to see the dummy screen (and to set the primary display, if this cannot be done by the OS or the iGPU's software) in order to start a game on the eGPU (I thought this could be done easily without seeing the dummy screen; I'll check this). And the window-mode limitation comes from having to drag that game to the iLCD with the mouse. I thought this was possible in other ways; I'll try the ones mentioned here and look for others if needed.

I have Ultramon, but I don't have the PE4C or an eGPU yet, so I cannot run a comprehensive test. That may be in a month or later.


As my 2nd MSI GTX 970 has very loud coil noise, I will replace this one again.

My supplier offered to replace it with a different brand.

Is EVGA better as an eGPU card?

If you want a lower-performance card that fits the AKiTiO chassis, then peruse the list at http://forum.techinferno.com/enclosures-adapters/7205-us%24200-akitio-thunder2-pcie-box-16gbps-tb2.html#post98210 . If you want the fastest GTX 970, then get the Gigabyte Gaming G1. It's easily overclockable to >1500 MHz boost and has a 220W + 12.5% power limit (250W!). If you want a smaller, higher-performance GTX 970, then consider the 9.5" EVGA GTX 970 FTW. At 9.5" it would still protrude out the side of a modified AKiTiO box OR require a PCIe riser to host it outside the AKiTiO chassis.


As I am using an extra case (Cooler Master Elite 130) to put all the parts inside, I don't need a small card.

Haven't there been people recommending EVGA cards, as they seem to work better?



EVGA cards were recommended by squinks since they in effect provide a PCI reset delay on bootup, such that the system boots with the iGPU enabled and therefore Optimus works: http://forum.techinferno.com/diy-e-gpu-projects/6918-updated-2013-13-15-macbook-pro-thunderbolt-2-egpu-plug-play-optimus.html#post94929 . Otherwise you may need workarounds like those suggested at http://forum.techinferno.com/enclosures-adapters/7205-us%24200-akitio-thunder2-pcie-box-16gbps-tb2-59.html#post109786

The Gigabyte Gaming G1 GTX 970 card has the highest OC performance potential of all the current GTX 970 cards on the market. That's because of its extraordinarily high 250W TDP limit and exceptional cooling system.


Thx! I think I will take an EVGA GTX 970 FTW, since I own an EVGA GTX 670 FTW which is working like a charm in my Hackintosh.

Additionally, EVGA released a new BIOS for the 970 cards with a 0 dB mode enabled and quieter operation.

Pity that the MSI card is so annoying, because its cooler is amazing!


So, I haven't been able to tinker with my EXP GDC V6 for a while, so I still haven't got it working. A short recap in case you don't remember: I have an EliteBook 2560p (i5-2520M, 8 GB RAM) on Windows 8.1 (installed on a GPT partition), an EXP GDC V6 over ExpressCard, and an NVidia GTX 560 Ti 2 GB. Before I installed Windows on a GPT partition the device gave an error 12; since installing on GPT, this no longer occurs. The issue atm is that the GPU still basically doesn't do anything. It shows up like this in GPU-Z:

[GPU-Z screenshot: eF3WUUq.png]

Since my last post I've bought an OptiPlex 755 for my girlfriend to work on and to test my Dell DA-2 with. It turns out the DA-2 is fine after all. I tried the EXP GDC with both a separate PSU and the DA-2, but there was no difference.

So @Tech Inferno Fan, do I need Setup 1.x to fix this? Because I think I've exhausted your other possible fixes and none worked for me.



Setup 1.30 is the last option available if all others are exhausted. Other hardware tests you may consider are: (1) testing the video card in a desktop system to confirm it's working OR test another video card with your eGPU hardware (2) test the EXP GDC V6 with another notebook to confirm it can do reliable PCIe comms (3) test the EXP GDC V6 using a Gen1 rather than Gen2 pci-e link, an option available in the 2560P BIOS.



Ah, I forgot to mention the 560 Ti did run just fine in my desktop. I tested the EXP with an old 8800 GT and it didn't work (although that may just be because that GPU is so old). I don't really have another laptop with an ExpressCard slot, but I'll give the Gen1 link a shot. If that doesn't work, I'll PM you about Setup 1.30.

