
eGPU experiences [version 2.0]


Tech Inferno Fan


@coyote

Are you running with physx on? I've experienced terrible drops due to physx.

Yes, I do. I'll try again without PhysX or with it on low.

EDIT: Well, turning PhysX to low somewhat solved the FPS drops. They aren't as frequent or as severe as when it was on high. Thanks again @jacobsson.


Hi, I'm a newbie here, so please excuse me for dumb questions.

I have a 2012 Mac Mini (quad-core i7 version, 4 GB RAM) and an early 2011 MBP 13" (dual-core i5, 16 GB RAM, Fusion Drive) and would like to build an eGPU to use with both of these systems. As far as I understand, the most efficient way to do it now is to buy a FirmTek ThunderTek/PX. Is that correct?

For now I don't use Windows (more and more games on Steam are released for Mac too), but I may eventually consider installing Windows.

I've seen builds proving the ThunderTek works with an MBP (http://forum.techinferno.com/diy-e-gpu-projects/6062-need-some-help-advice-proceeding-egpu.html), but what about the Mac Mini? Has anyone tried it? What problems should I expect running an eGPU under OS X on the Mac Mini? Will it be possible to run it under Windows on the Mac Mini?


@coyote

That is very weird. PhysX seems to be a problem for many people; at first I thought this only applied to the 600 series.

I remember testing many different PhysX versions with Borderlands 2 without success, so I had to turn it to Low.


I'm planning to try an eGPU with a Dell E6430, and I already have a PE4L V2.1b, but I haven't chosen a graphics card yet. The 750 Ti is too low in performance, and the 760 is 256-bit, so I'm afraid it could cause a bottleneck. 660 Ti, 760, or 670 (what?!) - which one is better? Or are there other options? Thank you in advance.


Hi guys,

I'm thinking about buying a new Lenovo ThinkPad L540 (i7-4600M, 4+4 GB RAM) and I'd like to check whether I've understood everything about an eGPU setup for it.

As it's a laptop with ExpressCard and a 4th gen Intel CPU, it should be suitable for x1.2Opt (NVIDIA) / x1.2 (AMD) solutions.

For that I'll need a power supply (a standard PC one will work) and an ExpressCard-to-PCI Express adapter. Given only one port (EC), the best option is the PE4L 2.1b (the PE4H 3.2 would also work, but it's just not better in ANY way).

Then, if I'm OK with using an external display only, it should work without any software modifications (other than installing the official NVIDIA / AMD drivers): just plug in, boot up, play games. If I want to use the internal LCD as well, I'll have to use modified NVIDIA drivers. As for AMD, the same can be achieved with Virtu drivers.

Considering the NVIDIA vs AMD dilemma: NVIDIA cards should be way faster in DX9 (thanks to Optimus compression) but pretty much the same in DX10/11/12. The difference will be even smaller (almost unnoticeable) when using an external LCD (which I plan on using). With DX10+ and an external display, AMD cards should actually be a bit faster (could anyone estimate by roughly what percentage?).

Considering the 'low bandwidth' performance loss: with a GTX 560 / GTX 660 and x1.2Opt (or a corresponding AMD card and x1.2), should I be able to get roughly 70% of the performance (and therefore FPS) I'd get in a desktop PC with a PCIe x16 slot?
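
For a rough sense of where a figure like that comes from, here's a back-of-the-envelope comparison of raw link bandwidth. This is only a sketch (the helper function and numbers are mine, based on the standard PCIe link rates); the ~70% figure quoted around the forum comes from measured game results, not from this raw ratio.

# Back-of-the-envelope raw PCIe link bandwidth, one direction (theoretical peak).
# Encoding overhead is 8b/10b for Gen1/Gen2 and 128b/130b for Gen3.
def pcie_bandwidth_mb_s(gt_per_s, lanes, payload_fraction):
    # transfers/s * usable payload fraction / 8 bits per byte, per lane
    return gt_per_s * 1e9 * payload_fraction / 8 * lanes / 1e6

x1_gen2 = pcie_bandwidth_mb_s(5, 1, 8 / 10)        # ExpressCard 2.0 link: ~500 MB/s
x16_gen3 = pcie_bandwidth_mb_s(8, 16, 128 / 130)   # desktop x16 Gen3 slot: ~15,754 MB/s

print(f"x1 Gen2:  ~{x1_gen2:.0f} MB/s")
print(f"x16 Gen3: ~{x16_gen3:.0f} MB/s ({x16_gen3 / x1_gen2:.1f}x the raw bandwidth)")

Despite a roughly 30x gap in raw bandwidth, the measured loss on x1.2Opt is far smaller, because most in-game traffic is asset uploads rather than sustained streaming, and Optimus compresses what does cross the link.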

BTW: could you recommend a good compatible GPU around 200 USD that would be fast enough for modern games but wouldn't be limited by the CPU or the eGPU setup (external display only; the Optimus gain on the internal LCD is unimportant)?

Thanks :).

BTW: I'll write a how-to tutorial if I succeed :).


@coyote

Would you mind manually updating your PhysX drivers with this?

NVIDIA DRIVERS 9.13.1220

Please post results with PhysX on medium/high in Borderlands 2.

(Of course, set PhysX to GPU in the NVIDIA Control Panel after the update!)

I deleted the old PhysX software and installed the new version, but I still get FPS drops to 20 FPS at "high" and 40 at "medium", same as before. I think I'll just stick with "low" or "medium".


1 - Fujitsu Lifebook AH532/G21, Intel Core i7-3612QM, 6 GB RAM, Windows 7 Ultimate 64-bit, Intel HD Graphics, NVIDIA GeForce GT 620M, EVGA GTX 660.

I have a similar setup with Windows 8.1. Do I need to run the Setup program to make Optimus (x1.2Opt) work, or is it just plug in, install drivers, and play?

Have you tried plugging it in and seeing what happens? Please post more details.


I deleted the old PhysX software and installed the new version, but I still get FPS drops to 20 FPS at "high" and 40 at "medium", same as before. I think I'll just stick with "low" or "medium".

That's too bad! It was worth a try at least. I wonder when NVIDIA will un-f*k their PhysX drivers; I really like the effects in BL2.


That's too bad! It was worth a try at least. I wonder when NVIDIA will un-f*k their PhysX drivers; I really like the effects in BL2.

Yeah, they really need to look after it, since it's one of their biggest features against AMD.


I have a Lenovo Y500 with a 2 GB GT650M (currently a single card, but it can be upgraded by adding another 2 GB GT650M in SLI using the Ultrabay). The iGPU is disabled by Lenovo.

I've read the forum but still have a few questions before going forward with the eGPU.

1. I went through the schematics for my laptop and they state that my WiFi adapter is connected to PCIe Gen 1 x2, mostly at 10 GT/s, but through an mPCIe slot. Can anybody confirm this and also tell me what kind of link I will get: 1.1, 1.2, 1.1Opt, 1.2Opt, or otherwise?

[Attached: screenshot of the Y500 schematic]

2. If I use a GTX 7xx eGPU with the above config, what will the approximate performance drop be compared to its x16 desktop counterpart (using an external LCD)?

3. What would you recommend: buying a GTX 7xx eGPU or going with the Lenovo Ultrabay GT650M, if I want good gaming performance at FHD? (Mainly to get playable frame rates in Watch Dogs :P)

4. Has anyone successfully used the Ultrabay slot for connecting an eGPU? (It is a full PCIe Gen 3 x16 slot.)


So apparently I misspoke and forgot I had taken out 2 GB of RAM. My system works fine with just 2 GB (except for World of Tanks, for some reason), but when I bump it up to 4 GB, I get error 43 in Device Manager. I've tried various solutions and none have worked. Any input would be great.


So apparently I misspoke and forgot I had taken out 2 GB of RAM. My system works fine with just 2 GB (except for World of Tanks, for some reason), but when I bump it up to 4 GB, I get error 43 in Device Manager. I've tried various solutions and none have worked. Any input would be great.

I guess you'll need a DSDT override. What's your OS?


One more question. I'm also unsure whether I need to purchase a separate monitor for attempting an eGPU on this laptop. Currently the iGPU and dGPU are switchable. I don't see Optimus mentioned anywhere for the NVIDIA K1100M chip, but PhysX is enabled and working.

Would adding an eGPU mean I have to disable the K1100M and/or get a separate monitor for better performance? I'm trying to minimize costs as much as possible.

Thanks.


Hi guys. I've been browsing the forums lately and am thinking of setting up an eGPU for my HP ZBook 17.

i5 4330 @ 2.8 GHz, Windows 7 64-bit

16 GB RAM

Intel HD 4600 - iGPU

NVIDIA K1100M - dGPU

I want to put a GTX 650 into this setup as an eGPU. Currently the iGPU and dGPU are switchable.

I have looked at my TOLUD and my machine is good for an eGPU setup.

Some questions I have.

1. Do I need to disable and uninstall the K1100M dGPU first and then plug in the new eGPU, or do I need to purchase Setup 1.x and disable it that way?

2. Does every eGPU install require Setup 1.x to get everything working?

3. Is it possible to keep all three GPUs working in the machine, and would it make any sense to use the K1100M as the PhysX processor? Any ideas on that one?

4. Is it always better to output the eGPU to a separate monitor?

If anyone can clarify those it would be greatly appreciated.

Thanks


Have you tried to plug and see what happens? Post more details please.

Got it working without the Setup 1.30 software, but Optimus doesn't work on the internal LCD; on an external display it works and scores nicely in 3DMark 11 (5500 points).

I installed the hardware (with SW1 on "3" because "1" didn't work), and in Windows 8.1 it first showed a conflict with the GT 620M. I disabled the GT 620M in Device Manager, rebooted, and the GTX 660 worked fine. Then I enabled the GT 620M again and there were no more conflicts. I'm using the latest official GeForce drivers and dedicated PhysX to the GT 620M.

My laptop has Secure Boot, so Setup 1.30 won't work without tweaking it...

- - - Updated - - -

The Fujitsu works fine now, but I have a problem with my Lenovo N200. It has an internal GeForce 7300 Go and Windows 7, but I haven't gotten it working yet using Setup 1.30. I can't disable the 7300 Go because Windows needs it to boot.

And then there's a conflict between the 7300 Go and the GTX 660. I can't disable it in Device Manager either... I need help, any advice?


If you're looking at quad-core systems, then I'd recommend the Ivy Bridge series as the best-value eGPU systems. In your case that would be a Lenovo T530. IVB uses a more power-efficient 22nm chip just like Haswell, but at a significantly lower price. Sandy Bridge's less efficient 32nm CPU sees its 4-core performance TDP-throttled, lowering the max multiplier and giving noticeably worse performance. By comparison, Haswell is only marginally faster than Ivy Bridge.

Yeah, I agree that Ivy Bridge is more reasonable, so I chose a Dell E6430 with a 3740QM at a 'reasonable' price :) Now the graphics card remains. I already have a PE4L 2.1b, so I'm deciding between a 660 Ti and a 670. Do you have any other suggestions?


I have a Lenovo Y500 with a 2 GB GT650M (currently a single card, but it can be upgraded by adding another 2 GB GT650M in SLI using the Ultrabay). The iGPU is disabled by Lenovo.

I've read the forum but still have a few questions before going forward with the eGPU.

1. I went through the schematics for my laptop and they state that my WiFi adapter is connected to PCIe Gen 1 x2, mostly at 10 GT/s, but through an mPCIe slot. Can anybody confirm this and also tell me what kind of link I will get: 1.1, 1.2, 1.1Opt, 1.2Opt, or otherwise?

[Attached: screenshot of the Y500 schematic]

2. If I use a GTX 7xx eGPU with the above config, what will the approximate performance drop be compared to its x16 desktop counterpart (using an external LCD)?

3. What would you recommend: buying a GTX 7xx eGPU or going with the Lenovo Ultrabay GT650M, if I want good gaming performance at FHD? (Mainly to get playable frame rates in Watch Dogs :P)

4. Has anyone successfully used the Ultrabay slot for connecting an eGPU? (It is a full PCIe Gen 3 x16 slot.)

Performance questions relative to a desktop are answered in the first post; please review it. No Ultrabay eGPU has been implemented as yet, and AFAIK nobody has created any sort of adapter to make it even possible.

The Y500 schematic has caught my interest. It is almost as if they are suggesting the mPCIe port is an x2 2.0 link. Certainly, mPCIe does make allowance for a second channel, but no vendor has used it as yet (see the PCI Express Mini Card (Mini PCIe) pinout diagram at pinoutsguide.com). As a first step, I'd suggest running, say, AIDA64 and seeing if it reports any of the southbridge ports as x2. That would be a good sign that Lenovo has indeed provided an x2-capable mPCIe port. If not, then I'd suggest walking away from further investigation, as an x2 electrical link would need to exist, an x2-enabling BIOS mod would need to be made, AND an x2 mPCIe eGPU adapter manufactured. That would be time-consuming and costly. Better to just get a Thunderbolt notebook and call it a day.
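
If AIDA64 isn't to hand, a rough alternative is to boot a Linux live USB with pciutils installed and read the link capability/status fields from lspci. This is just a sketch under that assumption, not a tool from this thread:

# Print the PCIe link capability (LnkCap) and negotiated status (LnkSta) of every
# device by parsing `lspci -vv` output (Linux, pciutils; run as root so the
# capability fields are visible).
import re
import subprocess

output = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

device = None
for line in output.splitlines():
    if line and not line[0].isspace():
        device = line.strip()          # header line, e.g. "02:00.0 Network controller: ..."
    match = re.search(r"(LnkCap|LnkSta):.*?Speed\s+([\d.]+\s*GT/s).*?Width\s+(x\d+)", line)
    if match and device:
        print(f"{device}\n    {match.group(1)}: {match.group(3)} @ {match.group(2)}")

What matters here is the LnkCap width on the root port the WiFi slot hangs off: x2 there would support the schematic's hint, while x1 would settle the question.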

Hi guys. I've been browsing the forums lately and am thinking of setting up an eGPU for my HP ZBook 17.

i5 4330 @ 2.8 GHz, Windows 7 64-bit

16 GB RAM

Intel HD 4600 - iGPU

NVIDIA K1100M - dGPU

I want to put a GTX 650 into this setup as an eGPU. Currently the iGPU and dGPU are switchable.

I have looked at my TOLUD and my machine is good for an eGPU setup.

Some questions I have.

1. Do I need to disable and uninstall the K1100M dGPU first and then plug in the new eGPU, or do I need to purchase Setup 1.x and disable it that way?

2. Does every eGPU install require Setup 1.x to get everything working?

3. Is it possible to keep all three GPUs working in the machine, and would it make any sense to use the K1100M as the PhysX processor? Any ideas on that one?

4. Is it always better to output the eGPU to a separate monitor?

If anyone can clarify those it would be greatly appreciated.

Thanks

It is not necessary to disable the K1100M. However, doing so gets you two advantages: (1) the eGPU can be used to drive the internal LCD via Optimus, and (2) x1 pci-e compression will engage on the eGPU, which mostly improves DX9 performance. Yes, other users have used the dGPU for PhysX and the eGPU to drive the app/game. Some DX11 apps may see better performance in that configuration than in a standalone x1.2Opt eGPU config.

Setup 1.x is a tool that some users need to get the eGPU working. Most eGPU implementations do not require it.

You do get better performance using an external LCD, since no PCIe bandwidth is needed to copy the image back to the internal LCD.
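
To put a rough number on that (a back-of-the-envelope sketch, not a measurement from this thread): driving a 1080p/60 internal panel means copying every rendered frame back across the same x1 link the game data uses.

# Rough cost of sending rendered frames back to an internal 1080p/60 LCD
# over the eGPU's PCIe link, assuming uncompressed 32-bit RGBA frames.
width, height, bytes_per_pixel, fps = 1920, 1080, 4, 60
frame_traffic_mb_s = width * height * bytes_per_pixel * fps / 1e6
print(f"internal-LCD frame copies: ~{frame_traffic_mb_s:.0f} MB/s")   # ~498 MB/s

# A PCIe 2.0 x1 (ExpressCard) link carries roughly 500 MB/s per direction
# (5 GT/s with 8b/10b encoding), so uncompressed internal-LCD output alone
# would nearly fill it -- hence Optimus compression, and the advantage of
# an external monitor plugged straight into the eGPU.
link_mb_s = 5e9 * (8 / 10) / 8 / 1e6
print(f"PCIe 2.0 x1 usable bandwidth: ~{link_mb_s:.0f} MB/s")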


Yeah, I agree that Ivy Bridge is more reasonable, so I chose a Dell E6430 with a 3740QM at a 'reasonable' price :) Now the graphics card remains. I already have a PE4L 2.1b, so I'm deciding between a 660 Ti and a 670. Do you have any other suggestions?

I've used a GTX 660 Ti, a GTX 670 and a GTX 770. My preference is the mini GTX 670 made by Asus. It makes for a very compact eGPU implementation with very little real-world eGPU performance degradation compared to the GTX 770. There are mini GTX 760 cards made by Asus and MSI too, about 3% slower than the GTX 670.


Hi, firstly I just wanted to say thanks for all the great info in this thread!

I have just completed my eGPU setup, so I thought I would share; it actually couldn't have been much easier.

Some background: I bought a new ultrabook (Asus Taichi) around 6 months ago, so my old laptop is hooked up to the TV for home theatre and occasional games. The old laptop is a Samsung R620:

Windows 8.1 64-bit, Core 2 Duo T6500 (upgraded to a T9600), PM45 chipset, 3 GB RAM, integrated 512 MB Mobility Radeon HD 3300, ExpressCard slot

It's probably over 4 years old now; I considered replacing it but like the idea of making something last longer, so I started looking at upgrades. First I put in a 1 TB Toshiba hybrid HDD (for about £45): more storage and also a good performance boost. Second, I upgraded the CPU to a T9600 (for about £25). Then I started to look into an eGPU… obviously not aiming for anything high end, but I figured I should be able to go way beyond the laptop's integrated graphics pretty easily and cheaply.

After quite a bit of time trying to figure out a PE4L/H and ATX PSU arrangement, which was coming in a bit more expensive and complicated than I would have liked, I came across the EXP GDC 6.0. As I'm currently living in China, this came with the power supply, adapter, and power cable for £30!

For the graphics card I went with an EVGA GTX 550 Ti 2 GB GDDR5, for about £45. I had read that 2 GB may be helpful in PCIe-bandwidth-limited arrangements.

In terms of setup, I booted up with everything connected and Windows found it as a display adapter but showed a driver error in Device Manager. I right-clicked, chose Update Driver, let it try automatically, and it had no problem downloading and installing the right drivers; it then showed up in Device Manager as a GTX 550 Ti. A quick reboot with the TV hooked up to the eGPU via HDMI, and the display came straight up! As I said, way easier than I was expecting!

I haven't yet had a chance to try many games to see how it really performs…

A few further notes:

  • My initial reaction is that performance easily goes beyond what it was before, and also beyond my i5 ultrabook with Intel HD 4000. This is perfect, as it means I don't have to load games onto the limited ultrabook SSD, and it also avoids the hassle of connecting that machine to the TV.
  • I initially thought there was some conflict, as the laptop LCD backlight was on but with no image. I think what was actually happening is that it was trying to display on both screens but the 1080p resolution was too high for the laptop panel. If I choose just the internal or just the external display, everything switches over with no problem.
  • The EXP GDC power supply arrangement is great. Firstly, it delivers up to 220 W from a power brick (albeit a big one). Secondly, it only powers up when an ExpressCard connection is detected, so it automatically powers on and off with the computer.
  • I tried it out of interest only, but removing and then reattaching the ExpressCard with Windows up and running doesn't seem to work; it doesn't re-detect it.
  • I wouldn't have been able to do an x2 arrangement, as my ExpressCard is on slot 3 and my Ethernet on slot 4 (WiFi on slot 1 was the only accessible port).

Cheers


Does the Microsoft Surface Pro 3 have a Thunderbolt connector?

Could this be a breakthrough by Microsoft into TB-eGPU adoption? Microsoft hints at a Thunderbolt connector in the Surface Pro 3:

A Reddit user by the name sherryoak asked Panos Panay and his team, "where is the thunderbolt link? you could have external gpu's and make it a desktop replacement, i thought the 'ultrabook' standard included thunderbolt." In response, the Surface team posted, "When you buy your Surface Pro3, do me a favor, and take a close look at the 'power connector,'" suggesting that the latest Intel-based Pro 3 might indeed be featuring the latest connector standard for Ultrabooks.

Discussion at the following URL suggests the proprietary power connector is going to attach to a dock using Thunderbolt:

http://www.reddit.com/r/IAmA/comments/26m9cu/we_are_panos_panay_and_the_surface_team_at/chse0d4?context=3

Above: the new 20-pin Surface Pro 3 power socket (image from wpcentral), suspected of being a Thunderbolt interface.


Hello,

I already have an eGPU installed on my computer.

Here is my system spec:

- Vaio VPCZ1, i7-640M

- HM57 chipset (PCIe 1.1)

- Asus GTX 660, with Intel HD for Optimus

- always using an external monitor

Here is my problem: after using my eGPU for the past few months, I haven't had any problems with FPS or anything else. I use my eGPU for playing demanding games like StarCraft (ultra settings) and Race Driver: GRID (ultra settings), both at 1080p, and I get at least 40 FPS minimum up to more than 100 FPS maximum. If I activate Vsync I get an almost constant 60 FPS in both.

But I just installed Counter-Strike: Global Offensive, also on ultra settings, and I only get a maximum of 40 FPS and a minimum of 20 FPS (even with Vsync off and very low settings).

Is there a problem with my system? Or is it because the GTX 660 is bandwidth-limited? Or maybe because CS:GO still isn't well optimized?

thanks


I'm running Windows 7 Pro 64-bit. I haven't found out how to do a DSDT override.

OK, so the first thing you'll need to do is download these two files:

Zippyshare.com - iasl-win-20120620.7z (iasl)

http://download.microsoft.com/download/2/c/1/2c16c7e0-96c1-40f5-81fc-3e4bf7b65496/microsoft_asl_compiler-v4-0-0.msi (ASL)

Then follow this video tutorial


You'll need the following to be pasted into your DSDT.dsl table:

QWordMemory (ResourceProducer, PosDecode, MinFixed, MaxFixed, Cacheable, ReadWrite,
    0x0000000000000000, // Granularity
    0x0000000C20000000, // Range Minimum, set it to 48.5GB
    0x0000000E0FFFFFFF, // Range Maximum, set it to 56.25GB
    0x0000000000000000, // Translation Offset
    0x00000001F0000000, // Length, calculated as Range Maximum - Range Minimum + 1
    ,, , AddressRangeMemory, TypeStatic)

Be sure to paste it after all the DWordMemory entries inside "Device (PCI0)" in the DSDT.dsl table you extracted.
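
If you adapt the Range Minimum / Range Maximum above for your own system, make sure the Length field matches them exactly. A quick sanity check of the values used above (plain arithmetic, not part of the override itself):

# Check the 36-bit window declared above: Length = Range Maximum - Range Minimum + 1.
range_min = 0x0000000C20000000            # 48.5 GiB
range_max = 0x0000000E0FFFFFFF            # one byte below 56.25 GiB
length = range_max - range_min + 1

print(hex(length))                        # 0x1f0000000, matching the Length field above
print(f"window size: {length / 2**30:.2f} GiB")   # 7.75 GiB of above-4GB address space

As I understand the override, the exact start address is not critical, as long as the window sits above 4 GB and within what your CPU can physically address.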

Once finished, just load the table with the asl command.

Follow both this http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D.html#dsdtoverride tutorial and the video link and you'll be done.

