
eGPU experiences [version 2.0]


Tech Inferno Fan

Recommended Posts

Buddy, I meant that if Intel implements MXM GPU support on the NUC platform, that will again shake up the computing market. I know he is only using the M.2 slot of the NUC with a riser.

@Tech Inferno Fan have you seen this breakthrough?

Sent from my iPad using Tapatalk

Yes, I've started a shadow thread about it: http://forum.techinferno.com/provisional-guides/10368-intel-nuc5i5ryk-gtx970%4016gbps-m-2-p4sm2-win8-%5Bwormholev2%5D.html


Is this guy a member here too? I think he is, or maybe he's just over at hardforum.com.

Sent from my iPad using Tapatalk

Not finding anything under his name or similar. It's likely, though, that he's been on here, even as a guest. I say that because it seems he's implemented the w4vz PCIe splitter to utilize his Dell DA-2 for power: http://forum.techinferno.com/enclosures-adapters/9426-220w-dell-da-2-ac-adapter-discussion.html#post122731


Buddy, I meant that if Intel implements MXM GPU support on the NUC platform, that will again shake up the computing market. I know he is only using the M.2 slot of the NUC with a riser.

@Tech Inferno Fan have you seen this breakthrough?

Sent from my iPad using Tapatalk

Won't happen. If Intel officially pushes eGPU over Thunderbolt, why would they implement expensive MXM GPUs in their NUCs? It would exceed the NUC's thermal target as well. Definitely not going to happen, if you ask me.

Hope I understood you this time. Sorry for the previous misunderstanding.


Does anyone know a way to increase the ventilation of a Mac?

Add a big fan to pull air away from the hot area,

or use an aluminum block with a fan, attached at the processor level (Peltier).

Even with my eGPU (Galax 970) on a MacBook Pro Retina 13", the CPU goes up to 90 °C (194 °F).


Hi guys,

I just got my PE4C v3.0 today.

Plugged my Zotac GTX 670 into my ThinkPad T410s (Win 7 Pro 64-bit) with a Dell DA-2 power supply, using the ExpressCard slot.

And it all worked like a charm!

However, when I checked the bandwidth through GPU-Z, it shows PCI-Express x1 1.1.

My processor is an i5-520M, so I supposed I could get at least x1 1.1Opt; I really need that compression thing, lol.

How do I enable Optimus? Or is it impossible with my current processor? I used the latest Nvidia desktop driver (v353.30).

Sorry, I'm new here. Hope you can help with this.

Just ran 3DMark Fire Strike and got only about 2357,

while 3DMark11 gave only 3150. I suppose it should be able to get way higher than this.

By the way, I already disabled the HDMI audio to free up more bandwidth.

Thanks in advance.

Regards,

Eki


I've been using an eGPU for 3 months now, and I know that an eGPU over mPCIe/EC can't use 100% of the GPU's power since bandwidth is limited. I have some questions:

1. Do laptops with PCIe 2.0 or 3.0 have 2, 4, 8, or 16 lanes?

2. Why do the EXP GDC, PE4C, and PE4L have x16 slots while the laptop has limited bandwidth?

3. I use a GTX 970 with an EXP GDC Beast plugged into an mPCIe 2.0 x1 slot (Intel HM67); do I get more than 80% out of this card?

4. In topic http://forum.techinferno.com/implementation-guides-pc/7388-15-lenovo-w540-r9_290x-gtx780ti%4010gbps-4gbps-sonnet-ee-se2-pe4l-2-1b-%5Bgothic860%5D.html#post101232 gothic860 gets high benchmark scores with EC2; does that mean he can use the full power of the GPU?

5. In http://forum.notebookreview.com/threads/diy-egpu-experiences.418851/ FAQ 13.5, what does it mean that PCIe compression engages, gaining anywhere from 20-333% better performance over an x1 2.0 link without compression?


I've been using an eGPU for 3 months now, and I know that an eGPU over mPCIe/EC can't use 100% of the GPU's power since bandwidth is limited. I have some questions:

1. Do laptops with PCIe 2.0 or 3.0 have 2, 4, 8, or 16 lanes?

2. Why do the EXP GDC, PE4C, and PE4L have x16 slots while the laptop has limited bandwidth?

3. I use a GTX 970 with an EXP GDC Beast plugged into an mPCIe 2.0 x1 slot (Intel HM67); do I get more than 80% out of this card?

4. In topic http://forum.techinferno.com/implementation-guides-pc/7388-15-lenovo-w540-r9_290x-gtx780ti%4010gbps-4gbps-sonnet-ee-se2-pe4l-2-1b-%5Bgothic860%5D.html#post101232 gothic860 gets high benchmark scores with EC2; does that mean he can use the full power of the GPU?

5. In "DIY eGPU experiences | NotebookReview", FAQ 13.5, what does it mean that PCIe compression engages, gaining anywhere from 20-333% better performance over an x1 2.0 link without compression?

1. There are no PCIe 3.0 notebooks so far. For more lanes you'd need more mPCIe slots or M.2/NGFF. An M.2/NGFF slot delivers x2 2.0, which would be the better choice because it's simpler than using two slots. Look for this if you want a new notebook.

2. They don't have 16 lanes. The slot you put the graphics card in is a physical x16 slot so the card sits securely, but electrically it is only connected via x1.

3. You'll get around 70-80% of the card's desktop performance.

4. No. The only way to use the full power of an eGPU is to connect it with all its lanes, and that's not possible for now with DIY eGPUs. We're limited to x4 2.0 with Thunderbolt 2.

5. It means the Nvidia Optimus compression. Optimus enables PCIe compression when using an x1 connection. Using x2 2.0 over M.2 would disable this compression, but you'd have double the bandwidth compared to mPCIe or EC.
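A quick back-of-the-envelope sketch puts these lane and bandwidth numbers in context (purely illustrative; these are nominal link rates, and real-world throughput is lower due to protocol overhead):

```python
# Nominal PCIe link bandwidth per generation and lane count.
# PCIe 1.x/2.0 use 8b/10b encoding; PCIe 3.0 uses 128b/130b.

def effective_gbps(lanes: int, gen: int) -> float:
    """Effective data rate in Gbit/s after encoding overhead."""
    raw = {1: 2.5, 2: 5.0, 3: 8.0}[gen]              # GT/s per lane
    eff = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130}[gen]  # encoding efficiency
    return lanes * raw * eff

links = {
    "mPCIe/ExpressCard (x1 2.0)": effective_gbps(1, 2),
    "M.2/NGFF (x2 2.0)": effective_gbps(2, 2),
    "Thunderbolt 2 (x4 2.0)": effective_gbps(4, 2),
}
for name, gbps in links.items():
    print(f"{name}: {gbps:.0f} Gbps")   # 4, 8, and 16 Gbps respectively
```

This is why M.2 (x2 2.0) carries double what mPCIe/EC can, and why Thunderbolt 2 tops out at x4 2.0.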


1. There are no PCIe 3.0 notebooks so far. For more lanes you'd need more mPCIe slots or M.2/NGFF. An M.2/NGFF slot delivers x2 2.0, which would be the better choice because it's simpler than using two slots. Look for this if you want a new notebook.

2. They don't have 16 lanes. The slot you put the graphics card in is a physical x16 slot so the card sits securely, but electrically it is only connected via x1.

3. You'll get around 70-80% of the card's desktop performance.

4. No. The only way to use the full power of an eGPU is to connect it with all its lanes, and that's not possible for now with DIY eGPUs. We're limited to x4 2.0 with Thunderbolt 2.

5. It means the Nvidia Optimus compression. Optimus enables PCIe compression when using an x1 connection. Using x2 2.0 over M.2 would disable this compression, but you'd have double the bandwidth compared to mPCIe or EC.

I think it should manage at least 80%; at 70%, is buying an eGPU a waste of money? And if I buy a new laptop with a newer Intel chipset and an M.2 slot, will I get more performance?


I think it should manage at least 80%; at 70%, is buying an eGPU a waste of money? And if I buy a new laptop with a newer Intel chipset and an M.2 slot, will I get more performance?

Why do you ask if you know better? You have to decide yourself whether it's a waste of money. A GTX 970 will still have more power than a GTX 960, but it will also still cost more.

Yes, M.2 delivers double the bandwidth and thus better performance. I just wrote that in my previous post...


Why do you ask if you know better? You have to decide yourself whether it's a waste of money. A GTX 970 will still have more power than a GTX 960, but it will also still cost more.

Yes, M.2 delivers double the bandwidth and thus better performance. I just wrote that in my previous post...

Before I bought the eGPU I didn't know much, but it doesn't matter; I think I should be satisfied with 80% of desktop GPU performance for less money than a PC.


Hi guys. So here is another update regarding Lenovo E530 + GTX 660:

I managed to do the DSDT override, since my TOLUD was 3.49 GB and no "large memory" entry was available.

The huge pain in the ass is getting my eGPU past "code 12". I've tried almost every different PCI compaction method, but still no luck.

Suppose my PCI compaction was successful: can I verify that by looking in Setup 1.x using NVflash or something else?

My eGPU never shows up in NVflash, and that seems shady.

I would really appreciate some help here!

jacobsson/Tech Inferno Fan - do you mind if I ask whether you ever found a solution to your E530 setup? I'm attempting the same now, but I'm having the same trouble as you. If I try to boot with the eGPU connected, my laptop hangs on a black screen. Setup 1.3 doesn't recognize it no matter what I try, and the only way I can get my laptop to recognize the eGPU is to hot-plug it while Windows is sleeping, but then I get error 12, despite having already performed a DSDT patch. When I try a PCI compaction in Setup 1.3 (without the eGPU detected, remember), it always hangs. Any tips?


jacobsson/Tech Inferno Fan - do you mind if I ask whether you ever found a solution to your E530 setup? I'm attempting the same now, but I'm having the same trouble as you. If I try to boot with the eGPU connected, my laptop hangs on a black screen. Setup 1.3 doesn't recognize it no matter what I try, and the only way I can get my laptop to recognize the eGPU is to hot-plug it while Windows is sleeping, but then I get error 12, despite having already performed a DSDT patch. When I try a PCI compaction in Setup 1.3 (without the eGPU detected, remember), it always hangs. Any tips?

jacobsson aborted the E530 eGPU, instead going for a more convenient TB/ExpressCard configuration.

If you need to do the sleep/resume trick after booting with the wifi card installed, then you have a whitelisting issue. If you can do the sleep/resume trick without booting with the wifi card, you may have a CLKRUN issue. The PE4C V3.0 has a CLKREQ switch that starts the clock after some delay, needed in some cases where the eGPU otherwise isn't detected when hot-plugged after boot.


My PE4C eGPU just arrived from Amazon, and I noticed that the battery is hanging loose. I'm not sure if this is how it's built, but it feels like it's missing a solder joint or something. I can't test right now, as I'm still waiting for my GPU. I'd appreciate the help, thanks.

Imgur


If you need to do the sleep/resume trick after booting with wifi card, then you have a whitelisting issue.

Thanks for your reply. I finally managed to get the eGPU detected in Setup 1.3 and on the desktop thanks to hot-plugging; however, I still have error 12. I've tried all the solutions listed here: http://forum.techinferno.com/diy-e-gpu-projects/2129-diy-egpu-troubleshooting-faq.html#error12_faq1 , but the only one that works for me is limiting my RAM to under 4 GB.

I was hoping that, now that my eGPU is recognized in Setup 1.3, I could try PCI compaction to solve the issue. But the only compaction scenario that works is the last one (I want to say "END 64-bit"? I forget the exact name :P). However, at the very end, when the program refreshes the status window, I always get an "error reading from drive C" message with the options Abort, Ignore, Retry, or Fail; no matter what I pick, I get the same error over and over. I've tried limiting the scope, but that doesn't help. Any thoughts on a way to fix this, or another way to fix error 12 that's not listed at the link above? I've even tried a DSDT fix and everything worked smoothly (I see the large memory allocation in Device Manager), but Setup 1.3 still shows "DSDT: no".
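For anyone wondering why capping RAM under 4 GB sidesteps error 12: the eGPU's 32-bit BARs have to fit in the MMIO window between TOLUD and the 4 GiB boundary, and a high TOLUD leaves only a small window. A rough, illustrative sketch of the arithmetic (exact BAR sizes depend on the GPU and the other devices in the system):

```python
# Rough arithmetic behind "error 12": 32-bit device BARs must fit between
# TOLUD (top of low usable DRAM) and the 4 GiB boundary. Sizes illustrative.

GiB = 1024 ** 3
MiB = 1024 ** 2

def mmio_window_below_4g(tolud_bytes: int) -> int:
    """Bytes left for 32-bit MMIO between TOLUD and 4 GiB."""
    return 4 * GiB - tolud_bytes

tolud = int(3.49 * GiB)                  # TOLUD reported earlier in the thread
window = mmio_window_below_4g(tolud)
print(f"32-bit MMIO window: {window // MiB} MiB")   # roughly 522 MiB

# A discrete GPU typically wants a 256 MiB memory aperture plus smaller
# regions, on top of what the iGPU, wifi, USB controllers, etc. already
# claim; if the total doesn't fit, Windows reports error 12. Lowering
# usable RAM lowers TOLUD and widens this window, while a DSDT override
# instead adds a "large memory" region above 4 GiB for 64-bit BARs.
```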


Could someone please help me out with my power supply issue? I have the Bplus PE4L 2.1a adapter, and it works with my 750 Ti just fine. However, I can't figure out how to get my GTX 680 to work.

Here is what I currently have: egpu - Album on Imgur. That is how I normally run my 750 Ti, but this time, if I put the 20+4-pin rail into the adapter, the card isn't recognized and the fan spins like it's at 100%. I've tried emulating something like this: http://forum.techinferno.com/attachments/diy-e-gpu-projects/14970-58l5rfw.jpg , but with little luck. Each time, the card sounds like it's about to go off the rails if I connect the 24-pin to it, and I'm afraid of shorting something.

Any help? =/

edit: the PSU delivers 22 A on +12 V and its max load is 400 W. It's brand new. I have another 350 W unit I can also try, but it's a pretty crappy always-on one. I did try waking it from sleep, and I've also tried the delay switches; nothing helps. From sleep it simply restarts my PC, and if I have the ExpressCard in instead of hot-plugging, the PC won't start.

edit2: well, just for kicks, I dusted off my 2540p and tried it with that. With the delay switch set to 3, I was able to get it going: http://imgur.com/a/QMfFV . So that's cool; something must be off with my 2570p, then? Maybe it's the Gen2 thing that Bplus doesn't like? But my 750 Ti is set to Gen2 with no problems there.

Hm.

edit3: resolved? I set both of the delay switches on the Bplus to 3 (usually this messed things up with my 750 Ti), and this time the PC booted right up. All systems go, running Witcher 2 maxed out at a stable 60 fps, weeeee.


Is there such an adapter? Could I make one? Let's say I were to use my mini PCIe slot and ExpressCard slot together, with a PCIe x2-to-x16 adapter, on my EliteBook 8740w; would I get 4 Gbps? I ask because I've heard I can only get a maximum of 2 Gbps bandwidth on my laptop, since it has an i7-640M and an iGPU that's not connected to the LCD. I really want 4 Gbps, since that's the sweet spot, and I've been saving up for 5 months now. Thanks for the read.


Hi, I'm new. Sorry if this has been answered; I'm just so confused by all the information. This is what I have:

Sager laptop:

i7-4810, 2.5-3.5 GHz

16 GB RAM

870M 6 GB

1 TB HDD

Is this compatible with an eGPU setup? If so, what do I need? All the setups I see are either older or MBP. I would like to use a 980 Ti if possible. Also, how does this work with the new Windows 10 coming out? What issues are people currently having, and what kind of performance hit will I be looking at? Is this really as simple as plug and play, or is there some mod tweaking in the process?


Hello. I'm using a Lenovo laptop. I've successfully implemented my eGPU setup using the ExpressCard slot and a PE4C. Now my question is this: are there any optimizations for increasing the bandwidth? Could I use the x2 port and route that to the mPCIe for extra bandwidth?


Hello, I have a problem with my eGPU. If I do not install the Nvidia driver (on Win 8.1 64-bit), I can see the eGPU output on the external monitor; after installing it, the screen goes dark. What's the problem?

My adapter is the EXP Graphics Display Card V7.


Won't happen. If Intel officially pushes eGPU over Thunderbolt, why would they implement expensive MXM GPUs in their NUCs? It would exceed the NUC's thermal target as well. Definitely not going to happen, if you ask me.

Hope I understood you this time. Sorry for the previous misunderstanding.

It did happen once, with this:

IMG_9188-640x280.jpg

Here's the link to its mini review: Fast, but compromised: Gigabyte


It did happen once, with this:

Here's the link to its mini review: Fast, but compromised: Gigabyte

Never heard of it before. Besides, showing me this thing, which is only mediocre in performance, doesn't prove wrong what I wrote. I didn't write that it's not possible to make; I wrote that it won't happen in the sense of manufacturers, or even Intel themselves, pushing those NUC boxes with MXM graphics cards. There's no reason to. There will be enough small-form-factor devices coming up if SteamOS and the "Steam Boxes" get pushed onto the market.

Apart from this, with the hopefully wider distribution of Thunderbolt (standard or USB-C plug) and official support for eGPUs: take a mobile quad-core Intel processor like the ones in the MacBook Pro 15", HP ZBook, ThinkPad W540/541, etc.; implement sufficient cooling (maybe AiO water cooling) that doesn't sound like a departing airplane; allow for at least 8 GB of RAM; add an M.2 slot for system storage, integrated wifi and LAN; and, as the important part, give it a Thunderbolt port in whichever form, with eGPU support.

That's it. Put that in such a NUC box. No more need for crappy, overpriced mobile GPUs if you can plug your desktop GPU into such a box. The Intel mobile quads are more than capable of competing with desktop Intel CPUs.


I would like to say thanks for your awesome work.

I had a DIY eGPU years ago (when I was using an old Core 2 Duo laptop + GTS 450 GPU, and the thread was still at NotebookReview) using the PE4L-EC2C. While it was great for a while, the Intel Core 2 Duo I have is aging, and I only have PCIe 1.0 at x1.

I had migrated to desktop computers for a while due to performance/price (I have a Core i7-5820K, 16 GB RAM, a GTX 970 with a custom loop, and a 144 Hz 1080p monitor). I was recently looking at some laptops, and the Asus G751JM caught my attention due to its price and performance.

I need a laptop for when I am traveling, and the GTX 860M doesn't look like a strong enough GPU to push a consistent 100 FPS at 1080p even on medium~high settings in the games I will play (like GTA 5, Grid Autosport, F1 2015, Dota 2, etc.).

It has a Core i7-4710HQ, 8 GB RAM, a GTX 860M (Maxwell; according to Google, roughly the same performance as a GTX 750 Ti) and, more importantly, a 1080p IPS display that is overclockable to 100 Hz, plus a Thunderbolt port. I immediately pulled the trigger to buy it, and for some reason it still hasn't shipped (bought the laptop from here) :neglected:.

I have a few questions about eGPUs, if you don't mind.

First of all, I heard that there is a new version of Thunderbolt. How can I tell which version of Thunderbolt is on my laptop? I tried googling it, but there is no conclusive answer as to which version is on the Asus G751. According to one person, the Thunderbolt host chip in the Asus G751 is BDW-TBT-LP (WR) LINK HERE

I am also wondering what the cheapest available Thunderbolt adapter is, so I can plug in a desktop graphics card (something like a GTX 970). If you can tell me which parts to get (cable, board, graphics card, PSU), that would be great. The enclosures that are available are outrageously expensive (around $600, which is the same amount I paid for the laptop).

This might be off topic, but there was a leaked driver that enabled G-Sync on some laptops (mainly laptops that use the eDP monitor interface, have an LG LGD046C panel, and no Nvidia Optimus, according to this guy). What are your thoughts about it?

Also, is there a way to permanently disable the integrated Intel graphics on an Nvidia Optimus laptop if there is no setting in the BIOS (unlike the Alienware M17x R3, which has a BIOS option to select either iGPU or dGPU)? I have the same eDP panel on my laptop, but according to reports, Nvidia Optimus prevents the G-Sync hack from working.

Last but not least, with the arrival of HBM (High Bandwidth Memory), do you think a Thunderbolt port would provide enough bandwidth, given the massive increase in bandwidth on HBM GPUs (Nvidia is claiming 3x higher bandwidth with HBM on Nvidia Pascal LINK HERE)?
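On that last question, it may help to note that the two bandwidth numbers live on different paths: HBM bandwidth is between the GPU and its own on-package memory, while Thunderbolt only has to carry the CPU-to-GPU traffic (commands, texture uploads, readbacks). A tiny illustrative comparison, with ballpark figures only:

```python
# Ballpark comparison (nominal figures, not measured): the Thunderbolt link
# and HBM sit on different paths, so HBM speed isn't throttled by the cable.

tb2_gbs = 16 / 8    # Thunderbolt 2: x4 PCIe 2.0, ~16 Gbps, ~2 GB/s
hbm_gbs = 512       # first-generation HBM, ballpark GB/s

print(f"Link (TB2): {tb2_gbs:.0f} GB/s vs. GPU-local HBM: ~{hbm_gbs} GB/s")
# The link only bottlenecks data that actually crosses it each frame;
# the GPU keeps its full HBM bandwidth for its own rendering work.
```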


Never heard of it before. Besides, showing me this thing, which is only mediocre in performance, doesn't prove wrong what I wrote. I didn't write that it's not possible to make; I wrote that it won't happen in the sense of manufacturers, or even Intel themselves, pushing those NUC boxes with MXM graphics cards. There's no reason to. There will be enough small-form-factor devices coming up if SteamOS and the "Steam Boxes" get pushed onto the market.

Apart from this, with the hopefully wider distribution of Thunderbolt (standard or USB-C plug) and official support for eGPUs: take a mobile quad-core Intel processor like the ones in the MacBook Pro 15", HP ZBook, ThinkPad W540/541, etc.; implement sufficient cooling (maybe AiO water cooling) that doesn't sound like a departing airplane; allow for at least 8 GB of RAM; add an M.2 slot for system storage, integrated wifi and LAN; and, as the important part, give it a Thunderbolt port in whichever form, with eGPU support.

That's it. Put that in such a NUC box. No more need for crappy, overpriced mobile GPUs if you can plug your desktop GPU into such a box. The Intel mobile quads are more than capable of competing with desktop Intel CPUs.

Yes, I agree with all your points, buddy. Hell, I don't want to spend $500 on an MXM-based GPU; I'd get a GTX 980.

Thunderbolt 3 ports should be standard on all devices. All devices, not just those fruity brands and hellishly expensive laptops.

