
eGPU experiences [version 2.0]


Tech Inferno Fan

Recommended Posts

I looked at that a while back, but the same low-resolution 1366x768 screen and no high-res options killed it for me. I've got that in my X230T and it's very frustrating at times :(

Although... if I could order it with a quad-core Haswell processor, then I would probably go for it. I'm speaking to Fujitsu at the moment regarding differences between the UK/US models in terms of fan/cooling, power adapter wattage and BIOS compatibility. I would consider purchasing a processor separately and swapping it out - just need to check a few things.

Have you swapped to a quad core in your little HP?

Link to comment
Share on other sites

Speed on Thunderbolt 2 Sonnet Echo Express III-D:

[screenshot: Thunderbolt 2 speed result]

Maybe I'll install Windows in the next few days to post some benchmarks, since the OS X drivers aren't as good as the Windows ones.

@gothic860, is the card usable in Mac OS 10.9? Have you connected anything to it? Can the signal be rerouted to a TB display?

Thanks a bunch!



Connected to a GTX 780 (no OC or anything) it's nearly plug and play; you only have to edit some .kexts (A Thunderbolt GPU on a Mac: How-to | Le journal du lapin).

I don't think it's possible to route it to the Thunderbolt Display, since the eGPU only drives the display that is connected directly to the GPU.



Hello,

Thanks for the reply. Can the card be used to render any OpenGL game? Also, if I understood correctly, the architecture is: laptop -> TB -> enclosure -> external display (HDMI or DVI). Am I correct?


I haven't installed Windows yet, but Cinebench, for example, works in OS X, so I think OpenGL is working fine.

It's: MacBook TB2 -> Sonnet Echo -> GPU -> DVI-D -> monitor, yes.

But I noticed one strange thing: I have a TB1 dock, and when I plug the eGPU into it I get about 800-900 MB/s in CUDA-Z (-> OK).

When I use the TB2 port I get about 1.3-1.4 GB/s (-> OK), but in benchmarks there is no difference.

So maybe the benchmarks don't need the bandwidth, or something isn't working correctly. It's working like a charm in Cinema 4D, and even Diablo 3 and other games get a huge performance boost, but I really don't know why the benchmarks over TB1 and TB2 give the same results.
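As a sanity check on those CUDA-Z numbers: Thunderbolt 1 carries a 10 Gbit/s channel and Thunderbolt 2 bonds two of them into 20 Gbit/s, with some of that lost to encoding and PCIe/DMA overhead. A rough back-of-the-envelope sketch (the ~75% efficiency factor is an illustrative assumption, not a measured value):

```python
# Rough check of the CUDA-Z readings against Thunderbolt's nominal bandwidth.
# The 0.75 efficiency factor (encoding + PCIe/DMA overhead) is an assumption.

def usable_bandwidth(nominal_gbit_per_s, efficiency=0.75):
    """Approximate usable host<->GPU bandwidth in GB/s."""
    return nominal_gbit_per_s / 8 * efficiency

print(f"TB1: ~{usable_bandwidth(10):.2f} GB/s (CUDA-Z measured 0.8-0.9)")
print(f"TB2: ~{usable_bandwidth(20):.2f} GB/s (CUDA-Z measured 1.3-1.4)")
```

That the game and render benchmarks score the same over TB1 and TB2 suggests they simply don't saturate even the TB1 link during the timed runs.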


I only use OS X, so that won't be a problem. The reason I am asking is that I have a switchable-graphics MBP (17'') with TB and I only play under OS X. I was wondering if the eGPU can be used to render games (you can try playing SC2, which has a "free" version).



Yes, that's possible, but an OWC Helios or a Sonnet Echo Express SE is much cheaper for a TB1 solution (just search the forum).

Switchable graphics isn't a problem in OS X; it's working without problems here.


My MacBook has an AMD 6750 discrete card and an Intel GPU. In your case (I see you have a 750M), can you use gfxCardStatus to toggle the eGPU as the primary video card?

Edit: I was considering a ViDock 4 Plus Overdrive (two 6-pin / 320 W / 329 mm), since I already have the ExpressCard slot in my MBP (though I could give a TB enclosure a shot, although as far as I understood the performance difference is not that big). I was simply wondering whether already having a discrete GPU would be a pain to get around (and require a Windows installation plus disabling the AMD card). Also, I am currently in "reading" mode before making a buying decision (I will, however, post here before I make up my mind in case there are some unanswered questions).



It would be a serious performance handicap to use an ExpressCard eGPU adapter with your 6750-equipped MBP. The reason is that you don't have an iGPU present in Windows, so you would not get the x1 pci-e compression. It's the x1 pci-e compression that accelerates mainly DX9 apps/games.

Besides, why buy the V*Dock when you can get double the bandwidth in a native TB enclosure for a lower price? See http://forum.techinferno.com/diy-e-gpu-projects/5793-cheapest-pcie-thunderbolt.html


I did five 3DMark Vantage tests. I still believe that port 2 isn't working at x2, because I can set port 1 to x2 with port 2 disconnected and get the same 3DMark Vantage scores as if I had both ports connected. I made another video explaining in more detail what I mean: https://www.youtube.com/watch?v=ODHr8OSpr90&feature=youtu.be

score: 10439 - port 1 only at x2, 1 monitor

score: 10062 - port 1 & port 2 at x2, 1 monitor

eGPU+iGPU compression:

score: 10280 - port 1 & port 2 at x1, 1 monitor

score: 10203 - port 1 only at x1, 1 monitor

score: 11389 - port 1 only at x1, 2 monitors

It's cool that I could gain about 1000 points through x1, but I want x2 to work right. ><



The x1 pci-e compression can skew the results in 3DMark benchmarks such that x1 may give better results than x2. It's most visible in 3DMark06 (DX9) but also in Vantage (DX10). The best way to check the pci-e link is to run GPU-Z: it will report either x1 1.0 or x2 1.0.

The best way to see the performance difference between x1.Opt and x2 is to run a DX11 game like Dirt 2/Dirt 3, which has a built-in benchmark that reports minimum and average FPS. x2 will definitely show better average FPS on your system. For other DX9/DX10 games you'll need to do your own tests to see whether x1.Opt is faster than x2.

Yes, you can set port 1 to x2 and remove the second lane. Setup 1.30 will report p1@x2.1 [PCI ID eGPU]@x1.1 if using only a single lane, or p1@x2.1 [PCI ID eGPU]@x2.1 if using two lanes. If port 1 is switched to x1, then it will only ever report p1@x1.1 [PCI ID eGPU]@x1.1.

Note: DIY eGPU Setup 1.30 performs PCI compaction. I highlight this because in your YouTube vids you are using the term 'compressing', which may confuse the issue. I'm referring to the x1 eGPU pci-e compression that is activated by the NVidia driver if there is an active Intel 4500MHD or newer iGPU, an NVidia Fermi or newer eGPU in use, and no NVidia dGPU active.
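For anyone checking the negotiated link outside GPU-Z: on Linux the same information appears in the LnkSta line of `lspci -vv`. A small parsing sketch (the sample line here is made up for illustration, not taken from this hardware):

```python
import re

# Hypothetical LnkSta line in the format printed by `lspci -vv`.
sample = "LnkSta: Speed 2.5GT/s, Width x2, TrErr- Train- SlotClk+"

def link_status(lnksta_line):
    """Extract (speed, width) from an lspci LnkSta line."""
    m = re.search(r"Speed\s+([\d.]+GT/s),\s+Width\s+x(\d+)", lnksta_line)
    if m is None:
        raise ValueError("not a LnkSta line")
    return m.group(1), int(m.group(2))

speed, width = link_status(sample)
print(f"PCIe link: {speed}, x{width}")  # -> PCIe link: 2.5GT/s, x2
```

A PE4L-style x2 link on a PCIe 1.0 chipset would show up as Speed 2.5GT/s, Width x2, matching what GPU-Z reports as x2 1.0.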


Incoming wall of text, sorry about that :)

Maybe I did not give all the details of my setup. Existing hardware:

1. MBP 17'' with switchable graphics (Intel HD 3000 and AMD 6750M), running Mac OS 10.9 with the latest patches

2. Thunderbolt display

What I would like to achieve:

1. A video card upgrade using an eGPU without investing in a new display (meaning I would really love to reuse my existing display).

2. An external enclosure which will hold an NVidia card (GTX 760 or above) without needing an external PSU (as far as I understood, the ViDock has a 200 W PSU, which would be enough for the GTX 760, rated at 170 W).

Based on the research so far, the TB display works on Windows (AnandTech | A First Look at Thunderbolt on Windows with MSI's Z77A-GD80) and has a "USB" audio card. Because the laptop has an Intel GPU, Optimus and compression would work as well. What would not work under a Win EFI boot is the on-board sound card (which is really not a problem). If I could completely bypass Windows and make the whole thing work under Mac OS, it would be perfect.

Based on the above, what would be your recommendations? Can I achieve what I want without changing the laptop or the display?

Thanks a lot!

Edit: the video card can also be an AMD one.


Because the laptop has an Intel GPU, Optimus and compression would work as well. What would not work under a Win EFI boot is the on-board sound card (which is really not a problem). If I could completely bypass Windows and make the whole thing work under Mac OS, it would be perfect.

The Intel HD3000 can only be enabled and made functional under Mac OS X; AFAIK, nobody has managed to get it working under Windows. That means at least the internal-LCD mode provided by NVidia Optimus or LucidLogix Virtu (AMD) will not function. Nor will the x1 pci-e compression engage if using Win7. If using Win8 it *might*, since Win8 activates additional GPUs with no reliance on the primary display being active; Win7 must have the primary display working for additional ones to activate.

The lack of an Intel HD3000 in Windows is the reason I recommend you pursue a native 10Gbps Thunderbolt solution instead of an ExpressCard-based one. 10Gbps will give you double the bandwidth to play with.

REF: http://forum.techinferno.com/diy-e-gpu-projects/3062-%5Bguide%5D-2012-13-mbp-gtx660ti-hd7870%40x2-2-th05.html#post42483 for a performance comparison of x1.2Opt versus x2 2.0 (slightly slower than 10Gbps). Included is a comparison of x1 2.0 versus x2 2.0 using an AMD card.
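Pulling the conditions scattered through this thread into one place: NVidia's x1 pci-e compression needs an active Intel 4500MHD-or-newer iGPU, a Fermi-or-newer NVidia eGPU, and no active NVidia dGPU (and under Win7 the iGPU must be driving the primary display). A toy predicate just to restate those rules; none of this is an actual driver API:

```python
def x1_compression_engages(igpu_active, fermi_or_newer_egpu, nvidia_dgpu_active):
    """Restates this thread's rules for when x1.Opt compression activates.
    igpu_active: Intel 4500MHD-or-newer iGPU enabled (and, on Win7,
    driving the primary display)."""
    return igpu_active and fermi_or_newer_egpu and not nvidia_dgpu_active

# The MBP case discussed above: the HD3000 cannot be enabled in Windows,
# so compression never engages regardless of the eGPU used.
print(x1_compression_engages(igpu_active=False,
                             fermi_or_newer_egpu=True,
                             nvidia_dgpu_active=False))  # -> False
```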


Okay, so after some delay, I finally decided to break out my old T60p ...

[photo of the T60p]

... while I decide on which laptop to buy. Just ordered the PE4L!

Just to clarify: the PCIe x1 slot will work with the PCIe x16 connector on my 750 Ti?



Hi, I'm really interested in how you managed to take apart your laptop in order to plug in your monitor. I looked around on the net but didn't find anything that would have brought me any closer to what you've done. Good job, BTW.



Um ... I just unscrewed everything. It wasn't a very neat job, and I broke a few plastic clips in the process. Fortunately, the T60p is fairly easy to open up, although it's still hard to disassemble entirely. As for the built-in display, you need to get at the hinges on either side. Sorry if it's not much help ... I didn't really have much of a game plan in doing this and it wasn't very cleanly done.

And sorry if the image I posted breaks a rule or something. I noticed that it doesn't show up anymore.

EDIT: I forgot, when removing the display, make sure you're careful not to damage the WLAN antennae, which can be fragile.



Please tell me, why would you need to take apart your laptop in order to use an external monitor via the VGA port? I laughed at first, then I realized I'm probably the one missing something.

Sent from my iPhone via Tapatalk



Oh, actually I have an old laptop that worked very nicely until the screen broke. The laptop would turn on, but the screen stayed black. So I wanted to figure out a way to repair it without having to replace the screen itself, which would be pretty costly.



Oooh, I get it now!

At first I thought: why the he*l would one need to disassemble the entire laptop in order to use the VGA port? =)

Good job!


I could kiss you! I finally see x2 being engaged in Windows using GPU-Z, and most of all I saw that in the Dirt 2 demo's in-game benchmark x2 works better than x1.Opt.

Dirt 2 in-game benchmark tests

x1.Opt

Average FPS: 61.9

Minimum FPS: 27.4

x2

Average FPS: 51.1

Minimum FPS: 31.0

I could even see the minimum FPS dip a lot more at x1.Opt while running the benchmark.

I did some more 3DMark benchmark tests, so I might as well post them, even though they don't really show the difference between x1.Opt and x2 on my Lenovo T410 (Intel Core i7-620, Windows 7 64-bit, ECS Nvidia GTX 560).

14947 DX9 x1.Opt

10715 DX10 x1.Opt

P3321 DX11 x1.Opt

9251 DX9 x2

10439 DX10 x2

P3022 DX11 x2
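The spread in those scores actually illustrates the earlier point that x1 compression mainly helps DX9. Computing the relative gain of x1.Opt over x2 from the numbers above (the P prefix on the DX11 scores is just 3DMark's preset marker, dropped here):

```python
# 3DMark scores from the post above: (x1.Opt, x2) per DirectX version.
scores = {"DX9": (14947, 9251), "DX10": (10715, 10439), "DX11": (3321, 3022)}

for api, (opt, x2) in scores.items():
    gain = (opt - x2) / x2 * 100
    print(f"{api}: x1.Opt is {gain:+.1f}% vs x2")
# DX9 shows by far the largest gain (~+62%), consistent with the claim
# that pci-e compression accelerates mainly DX9 apps.
```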

Nvidia unfortunately doesn't enable pci-e compression on x2 or greater links

So does that mean AMD/Radeon video cards do enable pci-e compression on x2 or greater links? If that's true, I feel like hitting myself for getting an Nvidia GPU.

The x1 pci-e compression can skew the results in 3DMark benchmarks

Yeah, I did some more 3DMark benchmark tests and saw that it was skewing the score.

in your YouTube vids you are using the term 'compressing', which may confuse the issue.

Yeah, I had a brain fart. :dejection:

P.S. Thank you for all your help! I'll be sure to email you with more questions so I can make YouTube videos explaining the DIY eGPU with Setup 1.x, because no one wants to read through 500 pages of forum posts and still not get it. And most of all, the YouTube video you have on page one of this thread is completely and utterly useless.



So does that mean AMD/Radeon video cards do enable pci-e compression on x2 or greater links? If that's true, I feel like hitting myself for getting an Nvidia GPU.

You can use the x1 2.0 (same as x2 1.0) and x1.1Opt (same as x1.Opt) results published at http://forum.techinferno.com/diy-e-gpu-projects/2747-%5Bguide%5D-12-dell-e6230-gtx660@[email protected] to compare yours against. I'd recommend you consider writing a T410 implementation guide so others can duplicate your x1.Opt and x2 1.0 results. Such a thread would also serve as a better place for performance-related discussion of your configuration.

FYI: AMD does no pci-e compression; it's an NVidia-only feature, hence why there are far more NVidia eGPU implementations.


Oh, actually I have an old laptop that worked very nicely until the screen broke. The laptop would turn on, but the screen stayed black. So I wanted to figure out a way to repair it without having to replace the screen itself, which would be pretty costly.

Yup, same thing happened to me. After using it with an external monitor and a dead built-in display, I finally decided to remove the built-in display before I bought a new laptop.

