eGPU experiences [version 2.0]


Tech Inferno Fan

- Show your interest in this particular feature/request (or even 24-bit for no visual degradation), and be polite and patient!

Wow, forcing 24-bit color would be really awesome for eGPU. Please keep us updated on whether this turns out to be possible!


Yeah, that would be great. The performance boost surely won't be as good as in 16-bit, though I hope it's something like 50% more FPS at best. I've myself noticed around 100% more FPS in 16-bit with the MSI Kombustor benchmark (KMark).

Btw, thanks for posting on the ENBdev forum :)


That's what these owners with an iGPU + AMD dGPU + NVidia eGPU did to succeed: madseason (Lenovo Y460 + HD5650M), Tondy (Vostro 3560 + HD7670M) and Frula (HP 4530s + HD6470M).

Thanks Tech Inferno Fan, disabling my dGPU worked and Optimus is functioning correctly; however, I cannot seem to use my internal LCD display. Where is said option located?


Hi guys!

I might have something of interest for many of you! :)

Do you remember the colour depth analysis made by Khenglish?

He discovered that playing at a reduced colour depth grants a great performance boost! This mostly helps internal-monitor rendering and is not very relevant on an external monitor.

I know he has been looking for a way to force recent games to render in fullscreen 16-bit, but most recent games don't support 16-bit at all.

So you can't play fullscreen in 16-bit, and for most games running windowed at a 16-bit resolution doesn't help either (I don't know why, though).

But I know a genius guy who could probably force most games to render in 16-bit fullscreen thanks to his fantastic tool, ENB.

Nice find! I think 24-bit would be optimal. 16-bit looks substantially worse than 32-bit in most situations. It was really only RE5 that looked close. 24-bit should look identical to 32-bit while giving a significant boost. 18-bit would also be interesting since most monitors are native 18-bit.
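Khenglish's numbers have a simple back-of-the-envelope explanation: with the eGPU driving the internal LCD, every rendered frame has to travel back over the narrow pci-e link, and at laptop resolutions an uncompressed 32-bit framebuffer at 60 FPS roughly saturates an x1 Gen1 link by itself. A rough sketch (the resolution, refresh rate and ~250 MB/s link figure are illustrative assumptions, not measurements):

```python
# Back-of-the-envelope framebuffer traffic at different colour depths.
# All figures are illustrative assumptions, not measured values.

def frame_mb_per_s(width, height, bits_per_pixel, fps):
    """Uncompressed framebuffer traffic in MB/s (1 MB = 1e6 bytes)."""
    return width * height * (bits_per_pixel / 8) * fps / 1e6

X1_GEN1_MB_S = 250  # assumed usable bandwidth of a PCIe 1.0 x1 link

for bpp in (32, 24, 16):
    traffic = frame_mb_per_s(1366, 768, bpp, 60)
    print(f"{bpp}-bit @ 1366x768, 60 FPS: {traffic:6.1f} MB/s "
          f"({traffic / X1_GEN1_MB_S:.0%} of an x1 Gen1 link)")
```

Halving the colour depth halves that traffic, which lines up with the roughly doubled KMark FPS reported above; 24-bit would cut it by a quarter.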


Follow the procedure to add an application to NVIDIA Optimus Technology's dedicated video 3D settings. There you'll also see how to enable the 'Display GPU notification in Notification area' option to confirm the eGPU is activated. Ensure you are not using an external LCD attached to your eGPU for this to work.

There's also the question of whether NVidia is already doing a 32-bit -> 24-bit conversion when their x1 "pci-e compression" engages in the driver. It would be an obvious way for them to maximize the limited x1 bandwidth without any loss of detail.



Thank you again. Everything is working as intended. One last question: how do I know if I'm running x1.1Opt?



No problem buddy, very nice find!


Things are looking up for Thunderbolt.

A smaller Thunderbolt 2.0 box from Sonnet: $500 and 80W of power.

Might just fit one of these 'mini' 760s, though 80W is unfortunately not enough to supply a 170W card.

At least we might get some Thunderbolt 2.0 traction.

My ASUS mini GTX670 is shorter, has better performance(?) and draws ~160W. Maybe a good candidate to fit the box?


A little late, but here is my setup.

Specs:

Thinkpad X220

Core i5 2410m

Intel HD Graphics 3000

8GB RAM

eGPU:

GTX 660 Ti

PE4L 2.1b Express card

Corsair 430M

Enclosure: Cooler Master Elite 120 mini-ITX

My 3dMark 11 scores were around P4800.

As for the enclosure, it was on sale for $35 and I got an open-box one for $26.

Here are some pics,

(five photos of the setup attached)


Nice, that's the same case I use. I'm trying to decide whether or not to mount an HDD in there to make use of the USB on the card.


Time to post my system as well I guess.

Also, hello to all :)

T400 w/ P8600 @ 2.4GHz, 4GB RAM, GMA4500 + HD3470, etc. etc. It's missing only the fingerprint reader for some reason (got it 2nd hand).

Sapphire HD4870 512MB, undervolted to 1.083V and underclocked to 650/650, all in BIOS (the dock's power supply can't handle more).

Thinkpad Advanced Dock - that's where I get the PCI-E from. x1 1.0 I think.

Some FSP power supply unit.

3Dmark06 score is 7305: ATI Radeon HD 4870 video card benchmark result - Intel Core 2 Duo Processor P8600,LENOVO 2767W2E

Now, on to some questions:

It seems like the T400 does support an x2 1.0 setup, and from the GPU scaling charts I assume the performance loss with that on an HD4870 will be minimal. However, I'll need to purchase the adapter for that; on the plus side, that way I'd be able to get the GPU back to its original clocks, which would mean even more performance.

Any suggestions on how should I proceed? A new laptop is kind-of out of the question for now, and this one is serving its purpose perfectly otherwise.

Cheers :)


I have an Ivy Bridge i7-3612QM @ 2.10 GHz, 8GB RAM and Intel HD 4000 (everything good except this :72: )

Everything else is in here: [ATTACH]8955[/ATTACH]

What exactly are you trying to say?


With your existing HD4870: if your expresscard slot is using port 1, 3 or 5 with no device occupying the adjacent port 2, 4 or 6, then you can activate x1E mode using Setup 1.x to get 15-30% better performance. If the adjacent port is mPCIe, or you have other mPCIe+mPCIe ports that can give a [port1+2], [port3+4] or [port5+6] combo, then yes, you could acquire PM3N/PM3Ns to give you an x2 link.

Though if I were in your position I'd offload the HD4870 for an NVidia GT430/GTS450/GTX460 or GTX560Ti. Any of those will give you x1.Opt (Optimus) capability with your system, whereby you gain x1 pci-e compression, internal LCD mode and a significantly faster video card, all with a simpler single-cable solution, or, in your case, from the existing dock you have modified.
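For context, the raw link arithmetic behind these options can be sketched as follows. PCIe 1.0 signals at 2.5 GT/s per lane and PCIe 2.0 at 5.0 GT/s, both with 8b/10b encoding (8 payload bits per 10 bits transferred); the hardware labels are just illustrative examples from this thread, and the figures ignore protocol overhead and the Optimus compression mentioned above:

```python
# Theoretical payload bandwidth for the PCIe link widths discussed above.

def link_mb_per_s(lanes, gt_per_s):
    """Usable bandwidth in MB/s of an 8b/10b-encoded PCIe link."""
    payload_bits_per_s = lanes * gt_per_s * 1e9 * 8 / 10  # strip 8b/10b overhead
    return payload_bits_per_s / 8 / 1e6                   # bits -> megabytes

links = {
    "x1 1.0 (expresscard / dock)": (1, 2.5),
    "x2 1.0 (PM3N pair)":          (2, 2.5),
    "x1 2.0 (PE4L v2.1b, Gen2)":   (1, 5.0),
}
for name, (lanes, rate) in links.items():
    print(f"{name}: {link_mb_per_s(lanes, rate):.0f} MB/s")
```

So an x2 1.0 link and an x1 2.0 link offer the same raw bandwidth on paper; the latter only needs a single cable.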



Only one problem with the dock.

Forget about installing anything other than a Radeon in it, and even then it has to be HD5xxx or older. I haven't tested with HD6xxx, but my 7850 wouldn't POST whatever I did (I have a reasonably beefy setup back home; this is my student machine, which is why it's kinda budget). Other people have tried with a variety of NVidia cards and they won't work either. I guess that's some kind of whitelisting, even though the HD5xxx series was released a fair amount of time after the T400.

Also, a note on the power supply.

Right now I'm running at 650/650/1.1V and there's INSANE coil whine coming from the dock whenever I run any game. CS 1.6 locked to 76 fps? QuakeLive? Dark Souls (actually the heaviest game in terms of how much the GPU draws)? It whines like hell. I might have to retrofit the dock's power supply at some point, or just get a powered PCI-E riser cable (most likely the latter, unless I want to make it an uber-dock). Right now my HD4870 is crippled even before considering the half-duplex issue :)

If you want to have some more WTF moments...

In order for the dGPU to boot at all, I have to enable both GPUs in the BIOS and enable OS detection for switchable graphics*.

I can also make it load Windows by turning it on with those settings, then going into the BIOS and changing the GPUs to internal-only + disabling OS detection for switchable graphics. That way the GMA4500 is still running (supposedly), but it's not driving the internal LCD, and it shows one VGA port as available (none work; I tried). The moment you shut down and try to start again, the system won't even POST. I have to remove the laptop from the dock and switch back to switchable graphics, then it POSTs. Also, shutting down really means holding down the power button, because it usually freezes on restart/shutdown (right when the sound card switches off), so I guess there's a race condition there.

Fun stuff :)

* NOTE: the OS switchable-graphics setting is supposed to check whether you have Win7/Vista and then enable either both GPUs or only the dGPU, but I guess it just checks whether you're on XP, because I've had multiple Linux distros on here and both cards stay on.

EDIT:

I converted my PCI-E riser cable to a powered one; the 4870 is now at 800/900/1.265V:

http://www.3dmark.com/3dm06/17382884

Can still pull more out of this setup I guess...


Unless you're really tied to the dock, use your expresscard to hook up an NVidia card. That dock seems like a lot of work, won't fit a decent GPU, and you can't use the internal screen. Check the implementations on the first page; there are a number of T400s on there. I'd use a PE4L v2.1b so you can get 1.2Opt when you buy a new laptop.

Or do what Nando said above


Hi,

I have an HP EliteBook 2530p, a PE4L V2.1 eGPU adapter and an NVidia GTS450.

I want to use the internal display, but none of the options work; the external display is perfect.

Can you give a description of how to do it?

My notebook chipset is: GS45 + ICH9M-E (4500MHD)

PLS HELP ME!

Thx

Anyone? Any tips?

See the Troubleshooting Guide -> "I can't get internal LCD mode to work". If still no go, use Driver Sweeper to clear the NVidia driver entries and do a clean reinstall of the NVidia driver. If still no go, consider a complete Windows reinstall.

You could simplify getting that rig started by modding a PCI Reset Delay into your PCI riser cable, as documented in http://forum.techinferno.com/diy-e-gpu-projects/4570-%5Bguide%5D-2012-13-rmbp-gtx660-sonnet-echo-express-se-%40-10gbps.html#post63754. That will allow the video card to bypass the BIOS checks.


I attempted to do that, but it looks like PCI-E extender cables are not meant to be twisted. In other words, I severed pretty much all the connections at the end joints. They were beginning to deteriorate anyway, but still. I guess it's time to move on from the dock :)

I guess the next step is to read up on the PE4L stuff: how it works, what I should pick, etc.

A new laptop probably won't happen anytime soon, as I'm a student at the moment... Plus there's a mountain bike at home that needs some love as well :)

Regardless, thanks for the help, and I'll keep you updated on my findings :)

EDIT:

OK now I'm confused.

Does the PE4H support Gen2? From what I've read, it's either "supports it with the PCIEMM-060B custom cable" or "does not support it".

I'm asking because I'm still thinking about doing the x2 1.0 setup. I'll just need to figure out how to get that cable out of the chassis... I have a strange feeling that I don't need that external VGA port anymore :D

