
15" Dell XPS L502x + GTX650Ti@2Gbps+c-mPCIe2 (PE4H 2.4a) + Win7 [daver160]



UPDATES:

Feb 3, 2013: Added 3DMark06 and 3DMark11 benchmarks for dGPU GT 540M as comparative values between dGPU and eGPU

Feb 17, 2013: Added pictures for mPCI-e slot underneath my laptop, and eGPU enclosure

May 14, 2013: Selling my PE4H to upgrade to PE4L!

First and foremost I want to thank Nando for all his troubleshooting and support as I worked to get my eGPU working. Without his help I wouldn't be up and running, especially with some of the interesting troubles I ran into.

I hope this thread will help anybody else with a Dell XPS 15 L502x, or a similar machine, who wants to get an eGPU going. Please note that the following instructions are what got my eGPU working; it's entirely possible that following my steps may not work for you as well. You may have to make your own adjustments based on your own configuration.

It should also be noted that I am only running 1.1Opt, as the GPU-z screenshot will confirm. I am working with HIT to exchange my current PM3N for the newer Gen2 capable mPCIe board. The PE4H is indeed PCI-e 2.0 capable, but the mPCI-e card itself is not. When I have the new mPCI-e card and cable, I'll run the benchmarks again for comparison.

Benchmark scores and photos to follow.

In this Post:

Requirements

Installing eGPU

Using Setup 1.x

Things you will need:

  • Free mini PCI-e slot (not half size!)
  • PCIe -> mPCIe adapter (e.g. PE4H / PE4L + the mPCIe adapter board and cable)
  • GPU of your choosing (at this time of writing, Nvidia card is required if you want to get internal LCD support)
  • PSU that meets the minimum power requirements of the GPU (something that can provide >150W on 12V1 rail should be enough)
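
As a rough sanity check on that PSU requirement: the wattage a rail can deliver is just volts times amps. A quick sketch (the 22 A rating below is a made-up example; read the real figure off your PSU's label):

```python
# Rough eGPU PSU sanity check. The 12V1 amperage here (22 A) is a
# hypothetical example -- read the real rating off your PSU's label.
def rail_watts(volts: float, amps: float) -> float:
    """Continuous wattage a single rail can deliver (P = V * I)."""
    return volts * amps

psu_12v1 = rail_watts(12, 22)   # 264 W available on the 12V1 rail
gpu_needs = 150                 # ballpark draw for a GTX 650 Ti class card

print(psu_12v1, psu_12v1 > gpu_needs)
```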

My current configuration is as follows:

Dell XPS 15 L502x:

  • Intel i7-2760qm
  • 8GB 1333MHz
  • Intel HD 3000
  • Nvidia GT 540M
  • Windows 7 Ultimate 64-bit
  • Setup 1.x (v1.10 and above)

eGPU components:

  • PE4H 2.4 (from HIT)
  • PM3N (from HIT)
  • mini HDMI cable, 60cm (from HIT)
  • Setup 1.x (v1.10b5) (from HIT)
  • Gigabyte GTX 650 Ti 2GB
  • Coolermaster Extreme 2 Power Plus 625W PSU
  • A shoebox (of all things!), since replaced by a re-purposed Shuttle XPC case to hold the PSU + eGPU

1. Preparing the machine for eGPU

Installing Setup 1.x

  1. Either purchase/donate for Setup 1.x from HIT, or follow the link they send you in an email after purchasing your eGPU kit
  2. Extract the installer's contents to anywhere on your machine (default is C:\eGPU), and run the "eGPU-Setup-mount.bat" batch file to install Setup 1.x. This will add a Setup 1.x entry to your machine's boot menu

Uninstall mobile Nvidia drivers

  1. Uninstall all Nvidia driver items. It's easiest if you uninstall "Nvidia Graphics Driver $drivernumber$" first, as it automatically uninstalls other items with it
  2. Navigate to your "C:\Program Files" folder (and if on 64-bit OS, "C:\Program Files (x86)") and delete the "Nvidia Corporation" folder

2. Installing and setting up eGPU components

Installing the eGPU

  1. Shut down your computer
  2. Plug mPCIe adapter board into laptop; make sure you screw it in to secure it in place
  3. Plug the GPU into the PE4H/PE4L board; the PCIe slot on the board is a little "loose", this is normal so don't worry if the GPU wobbles a bit
  4. Plug the mini HDMI cable into the PE4H board and the mPCIe adapter
  5. Plug the 6-pin PCI power cable from the PSU into the GPU
  6. Plug the 24-pin ATX power cable from the PSU into the SWEX board
  7. Plug the 4-pin floppy connector into the port on the PE4H/PE4L board. If you do not have a 4-pin floppy adapter, then use the molex-to-floppy adapter cable that is provided with the PE4H kit.
  8. Connect your eGPU to an external monitor
  9. Switch on the PSU, leave it in the ON position
  10. Switch on the SWEX, you should see green LEDs appear on the PE4H board indicating that it is on. The GPU's fan(s) and PSU fan should also turn on

Checking that Windows sees the eGPU

  1. Boot up Windows
  2. Open up Device Manager
  3. Expand the "Display adapters" branch
  4. You should see your eGPU, with a yellow exclamation mark beside it
  5. Open the eGPU's properties; you should see Device status = Error 12. At this stage this is a good sign: it means that Windows detects your eGPU assembly
  6. If Windows tries to install drivers automatically for your eGPU, let it; don't interrupt it. We will replace this driver later on anyway
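
If you prefer a prompt over Device Manager, `wmic path Win32_VideoController get Name,ConfigManagerErrorCode` lists each adapter's problem code. Here is a small Python sketch (not part of the original guide) for reading those codes; the meanings come from Windows' documented Device Manager codes:

```python
# Device Manager problem codes you're likely to meet while bringing up
# an eGPU, keyed by Win32_VideoController.ConfigManagerErrorCode.
PNP_CODES = {
    0: "Device is working properly",
    12: "Cannot find enough free resources (the 'Error 12' above)",
    28: "The drivers for this device are not installed",
    43: "Windows stopped this device because it reported problems",
}

def describe(code: int) -> str:
    return PNP_CODES.get(code, f"Other problem code {code}")

print(describe(12))   # at this stage of the guide, code 12 is expected
```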

3. Running Setup 1.x

Booting up Setup 1.x

  1. Turn on your laptop
  2. Select Setup 1.x from the Windows Boot manager menu
  3. Select Option 2, Start Setup 1.x in menu mode

Running PCI compaction and disabling the dGPU, creating the startup.bat script as we go along. If any of these steps fails at any time, see the section "Troubleshooting My Setup 1.x startup.bat". Failure can include the system locking up after performing an action such as PCI compaction or disabling the dGPU.

  • Select "Video Card > Hybrid GFX > dGPU [off]" to disable dGPU. Hit F3 to add to startup.bat script.
  • If you have PE4H+PM3N, then select "PCIe Port > Link Speed > G1". F3 to add to startup.bat
  • Select "Video Card > Initialize". F3 to add to startup.bat
  • Select "PCI compaction > method 32-bit / 32-bitA" so that the "pci_alloc_valid=yes". F3 to add to startup.bat
  • Select "Chainloader > win7". F3 to add to startup.bat
  • Select "startup.bat > !Speedup". F3 to add to startup.bat
  • Select "Run startup.bat". This will run all the commands you have added to your startup.bat script. It should run successfully, and prompt you to hit "[Enter]" when it's done.
  • Select "Chainload win7" to chainload to Windows 7 and automatically reboot
  • When at the Windows boot manager, select Windows 7. The previous startup.bat script you just ran has been chainloaded to the Windows startup event. So your dGPU should be disabled, and your eGPU ready to go

4. Getting the eGPU working in Windows

The first thing you should notice is that your Windows will still show the boot sequence on your internal laptop screen. However, once you reach the login screen the eGPU should automatically kick in and display to your external monitor. If it doesn't do this automatically, don't fret, it might just be Windows deciding to leave your internal laptop LCD as the primary display device.

  1. Run setup 1.x and automatically run startup.bat, chainload to Windows
  2. Log into Windows
  3. Check Device Manager; your eGPU should now show up without the Error 12. If you see Error 12 still, see the section "Troubleshooting My Setup 1.x startup.bat".
  4. If not already outputting to the external monitor, right-click on your desktop and open Screen Resolution. Set your external display as your primary screen.
  5. Download Nvidia drivers for your eGPU, install the new drivers
  6. Reboot, run startup.bat and chainload again

And that's it. You should now have a working eGPU. It's not too difficult, but it is time-consuming. The payoff is huge, however, as we can now use the desktop Nvidia driver on the mobile GT 540M as well. So if you need to pack up your laptop and travel, you still have full use of the dGPU!

You'll likely have a lot of wires hanging about. It would be wise to collect these stray cables and try to keep them organised with cable ties or using some sort of box. I've found that a shoe box holds the PSU and GPU very well, and helps keep the cables out of the way.


In this post:

Troubleshooting my Setup 1.x problems

Benchmarks

Pictures

Troubleshooting My Setup 1.x startup.bat

I had a *lot* of trouble with Setup 1.x. The first problem to arise was that the "dGPU [off]" command stopped working. While the script would run, it would immediately freeze when Setup 1.x attempted to return to the main menu. This happened consistently. To resolve this, Nando helped me cherry-pick certain troubleshooting steps. Normally a startup.bat script would look something similar to:

 
call iport dGPU off
call iport g1 1  
call vidwait 60 10de:11c6
call vidinit -d 10de:11c6 
call pci 
call chainload mbr

However, as disabling my dGPU was causing trouble, we needed to work around it. So with Nando's help, here is my new startup.bat:

 

setpci -s 1:0.0 COMMAND=0:7 10.l=0,0,0,0,0,0 -s 0:1.0 b0.w=10:10 19.b=0,0 3E.w=0:8 COMMAND=0:7 20.l=0,0
call iportbus force
call iport g1 1  
call vidwait 60 10de:11c6
call vidinit -d 10de:11c6 
call pci 
call chainload mbr

 

 

Note that "call iport dGPU off" has been replaced by the "setpci -s ..." and "call iportbus force" lines. These two lines effectively do the same thing that "call iport dGPU off" would do. These values were determined by Nando for me (cannot thank him enough). Also, with Nando's assistance, I used another version of Setup 1.x to help troubleshoot my problems. I am still uncertain whether the different versions of Setup 1.x have helped my situation, but both versions I've used seem to work now.

There were some other changes made to my DIYEGPUIMG:\core\pci.bat script as well, to test and try alternatives to the "call iport dGPU off" problem, but the updated startup.bat is essentially what fixed my issue.

Benchmarks (Below are all results running on 1.1Opt)

GPU-Z

gpuz.png

3DMark06

3DMark06 DX9 (Highest of 5): 19935

3DMark06 DX9 (Lowest of 5): 19600

versus 3DMark06 DX9 on the dGPU Nvidia GT 540M: 8430 (avg of 1st run and 2nd run).

3DMark11

3DMark11 DX9 (Highest of 3): P4204

3DMark11 DX9 (Lowest of 3): P4175

versus 3DMark11 DX9 on the dGPU Nvidia GT 540M: P1149 (avg of 1st run and 2nd run)
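
For a rough sense of scale, the highest runs above work out to the following speedups (a quick sketch using the scores quoted in this post):

```python
# Speedup of the GTX 650 Ti eGPU over the internal GT 540M, using the
# highest-run scores quoted above as (eGPU, dGPU) pairs.
scores = {
    "3DMark06": (19935, 8430),
    "3DMark11": (4204, 1149),
}
for bench, (egpu, dgpu) in scores.items():
    print(f"{bench}: {egpu / dgpu:.1f}x")   # roughly 2.4x and 3.7x
```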

3DMark Vantage

3DMark Vantage DX10 (Highest of 3): P13925

3DMark Vantage DX10 (Lowest of 3): P13895

Unigine Heaven

Unigine Heaven DX11 (Highest of 3): 1162

unigine dx11 2xAA_2.png

Unigine Heaven DX11 (Lowest of 3): 1156

unigine dx11 2xAA.png

Resident Evil 5 Benchmark

Resident Evil 5 DX 9.0, Variable (1280x800, Shadow = High, Texture = High, Overall quality = High, AA x2, Motion blur On):

136.8

re5 variable dx9.png

(other benchmarks with higher and lower settings fell between 90 - 150)

versus only 46.7 on the dGPU Nvidia GT 540M:

re5 variable dx9 540.png

Resident Evil 5 DX 10, Variable (1280x800, Shadow = High, Texture = High, Overall quality = High, AA x2, Motion blur On):

108.6

re5 variable dx10.png

(other benchmarks with higher and lower settings fell between 80 - 110)

versus only 59.6 on the dGPU Nvidia GT 540M:

re5 variable dx10 540.png

Pictures

PM3N mini PCI-e card installed, and with mHDMI cable

2013-02-16 15.14.08.jpg

2013-02-16 15.15.29.jpg

2013-02-16 15.17.35.jpg

eGPU enclosure

- Shuttle XPC case

- used VGA cable for demonstration purposes only

Internal PSU side view: Notice the empty space around the PSU. Lots of room for arranging cables, etc (but I'm too lazy to get into that)

IMG_0004.JPG

Internal GPU side view: just the GPU and the 4-pin floppy connector powering the PE4H board. Note the placement of the mHDMI port in the far bottom left corner: this poses a problem with the XPC's enclosure sleeve. I will have to cut out part of the sleeve to make a hole large enough to safely allow the mHDMI cable to be plugged in at all times. Assuming I get a PE4L, this hole should accommodate that board as well

IMG_0006.JPG

Internal top view: bit of a mess, but it's mostly just extra cable sitting on top of the PSU. The PSU power cable has been fed through a couple of cable cinches along the inside of the top frame (closest to camera, out of view)

IMG_0005.JPG

Front view: the SWEX board, and a bit of the power cable poking out (to reduce stress on the PSU's input port)

IMG_0008.JPG

Front panel: that's the slot where the 3.5-inch floppy drive would normally go. It's perfectly sized to allow easy movement of the SWEX board and PSU 24-pin cable, and just large enough that I can tuck in/pull out the SWEX board for quick packing.

IMG_0007.JPG

For storage, I can unplug the power cable, and place it as well as the SWEX board inside the case. Makes it extremely easy to pack up and move around, or just pack away altogether

IMG_0009.JPG

Rear view. Note that I can very easily tuck away the power cable into the larger horizontal hole at the bottom. Makes it extremely easy to pack up. (Power cable exits the enclosure through the 1.5x3 inch hole on the left of the back panel)

IMG_0010.JPG

When cinched up, the video cable (DVI or VGA, but not both) can be tucked away into the enclosure, through the hole that the power cable comes out of. Plenty of room inside, beside the PSU.

IMG_0011.JPG



Hi!

Thanks for the detailed how-to. I have a very similar laptop and I am about to pull the trigger on an eGPU, but I am planning to use it on the laptop screen. Is the performance drop between external display and laptop screen significant? Also, is the PE4L-PM060A a better option?


Hi!

Thanks for the detailed how-to. I have a very similar laptop and I am about to pull the trigger on an eGPU, but I am planning to use it on the laptop screen. Is the performance drop between external display and laptop screen significant? Also, is the PE4L-PM060A a better option?

I actually haven't done any real comparisons between internal and external displays. It's well known that the internal display will have lower performance, but by how much is currently not well documented. I'll give it a try tonight and see what I can get. Benchmarks are only half of the truth, but I'll run the RE5 benchmark in DX9 and let you know.

As for the PE4L, currently it is the best option because it is Gen2 compatible (PCIe 2.0), which means 1.2Opt performance. This also translates into better internal display performance, but still lower than performance if using an external display. It is well documented that the PE4L is Gen2 capable, so that's the safest route to go.
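
The Gen1-vs-Gen2 gap is plain arithmetic: both generations use 8b/10b line coding, so a Gen2 x1 link simply doubles the payload rate. A quick illustrative sketch:

```python
# Per-direction payload bandwidth of a single PCIe lane. Gen1 runs at
# 2.5 GT/s and Gen2 at 5 GT/s; both lose 20% to 8b/10b line coding.
def lane_mb_per_s(gt_per_s: float) -> float:
    return gt_per_s * (8 / 10) / 8 * 1000   # GT/s -> MB/s of payload

gen1 = lane_mb_per_s(2.5)   # 250.0 MB/s (x1.1Opt territory)
gen2 = lane_mb_per_s(5.0)   # 500.0 MB/s (x1.2Opt, PE4L on a Gen2 port)
print(gen1, gen2)
```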

In terms of setting up the PE4L for your Dell L502x, it should be "exactly the same" as what I've done - just keep in mind that I had some weird problems using Setup 1.x (still having some funky issues with it freezing on me). The workarounds I'm using still work; it's just not as smooth for me as it has been for almost everybody else.

UPDATE:

I ran the DX9 Resident Evil 5 on my internal screen. See below for the results

post-8815-1449499429026_thumb.png

Basically I got less than 30 fps using the same benchmark settings as in my 2nd post. The performance decrease is immense (over 100 fps!), but I think it's because I am on 1.1Opt instead of 1.2Opt.
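
For the curious, here's the arithmetic behind that figure, treating the internal result as a round 30 fps (an illustrative sketch, not an exact measurement):

```python
# Internal-LCD penalty on x1.1Opt, using the RE5 DX9 numbers above:
# 136.8 fps external vs roughly 30 fps internal (approximate).
external, internal = 136.8, 30.0
drop_fps = external - internal
drop_pct = drop_fps / external * 100
print(f"{drop_fps:.1f} fps lost ({drop_pct:.0f}% slower on the internal LCD)")
```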

Hopefully someone else with 1.2Opt can run a benchmark on their internal screen as well.


Wow... that's quite a FPS drop there... From what I gather from the NBR egpu thread, the performance decrease between internal screen and external display on 1.2Opt could be anywhere between 2% and 120%... that's a bit discouraging... Anyway if I'm gonna do this, I'll start a new thread and post my results. Many thanks again for taking the time to run the benchmarks and post back.


I actually haven't done any real comparisons between internal and external displays. It's well known that the internal display will have lower performance, but by how much is currently not well documented. I'll give it a try tonight and see what I can get. Benchmarks are only half of the truth, but I'll run the RE5 benchmark in DX9 and let you know.

It's well documented within http://forum.techinferno.com/diy-e-gpu-projects/2747-12-5-dell-e6230-gtx660%40x1-2opt-hd7870%40x1-2-pe4l-ec060a-2-1b.html#post37197 . You'd see nearly a doubling of your RE5 internal LCD scores going from x1.1Opt to x1.2Opt. You are being severely bandwidth restricted atm for that benchmark.


Wow... that's quite a FPS drop there... From what I gather from the NBR egpu thread, the performance decrease between internal screen and external display on 1.2Opt could be anywhere between 2% and 120%... that's a bit discouraging... Anyway if I'm gonna do this, I'll start a new thread and post my results. Many thanks again for taking the time to run the benchmarks and post back.

You definitely want to go with the PE4L kit, as it allows 1.2Opt.

From my RE5 test on 1.1Opt, I am running the benchmark at highest settings, though at a lower resolution. I played a couple of rounds of Mechwarrior Online on my internal screen, and had a consistent 30-35 fps, with a couple of dips down to 20 fps when my 'mech was hit by a triple LRM volley. My settings in MWO are also on the high side.

I would say that internal LCD is still decent performance, but nothing near the expected performance from a desktop GPU running 1.2Opt.

As Nando posted below, the difference between 1.2Opt and 1.1Opt is about equally staggering. The lower fps in his benchmark might only be due to the slower processor. As your system is nearly identical to mine, I believe you would see framerates of 130 fps or higher in the RE5 variable benchmark (with the same settings).


Interesting! I have a Dell XPS 15 (L502X) as well but my CPU is i7-2630QM instead. How does your eGPU perform? Is it bottlenecked a lot or does it run better than the dGPU GT540M?

Also, another question: if I bought a graphics card like the GTX 690 or GTX 680 (just a question, probably not going to do it), how would I power the card? I know the GTX 680 needs two 6-pin connectors and the GTX 690 needs two 8-pin connectors. Where would I find the plugs? (I'm thinking about the external supply cable instead of an actual power unit... how would it attach from the PE4H card to the graphics card if there's only 1 slot for the external supply cable?)

Some of these are pretty noob questions... I only found and started reading up on this stuff 3 days ago, and so far I've gotten my most confusing questions out of the way. Thanks for reading!


Yay! Another Nuckie! (I'm from BC)

Interesting! I have a Dell XPS 15 (L502X) as well but my CPU is i7-2630QM instead. How does your eGPU perform? Is it bottlenecked a lot or does it run better than the dGPU GT540M?

Your processor might as well be the same as mine, so I don't consider it a bottleneck, except in games that demand high CPU performance (mostly RTS). My eGPU runs really well, but is bottlenecked by my 1.1Opt link. For example, I can get really great framerates in games, anywhere from 30 to 120+ fps depending on graphics settings (resolution, texture, anisotropic filtering, anti-aliasing, DX11 tessellation, etc.). I've yet to have any game run below 30 fps unless I had full DX11 at 1920x1080. However, I know that I could get better results if I had full 1.2Opt running.

You simply cannot compare the eGPU to the dGPU GT 540M (I am going to run some benchmarks on the native GT 540M this week, for comparison between eGPU and dGPU). The eGPU blows the GT 540M out of the water, despite only being on 1.1Opt right now!

Also, another question: if I bought a graphics card like the GTX 690 or GTX 680 (just a question, probably not going to do it), how would I power the card? I know the GTX 680 needs two 6-pin connectors and the GTX 690 needs two 8-pin connectors. Where would I find the plugs? (I'm thinking about the external supply cable instead of an actual power unit... how would it attach from the PE4H card to the graphics card if there's only 1 slot for the external supply cable?)

(I know you probably won't buy a 680/690, but thought I'd address it all the same) I wouldn't bother getting a card as high-end as the 690, since eGPU will still be a bit of a bottleneck. Of course the 690 will perform better than a 660 or 670, but I don't think you'll get the full benefit of such a powerful GPU in an eGPU setup.

As recommended by Nando, and some other users out there, something like the GTX 460/560/660 is likely your best bang for buck card. Plenty of power from the GPU, yet doesn't require a really high wattage PSU. And a helluva lot cheaper for that matter.

Now as for the new PE4H 3.2, I don't know how the power supplies work with that. I only know that with the PE4H 2.4 kit you would supply power to the eGPU just like you normally would on a desktop (use the PSU's own PCI-e power cables to supply power to the GPU card).

For powering the desktop GPU card within the eGPU, there usually is a port on the GPU card itself. See this image for example. Though the cable is not plugged in, you can still see the ports that the cables plug into. This here is a 2 x 6-pin, like the GTX 680 you mentioned.

The way the power cables work is usually:

-PSU plugs into GPU

-PSU plugs into SWEX board or other power switch which controls power on/off for the entire eGPU unit

-PSU plugs into PE4H board

-PSU gets power from power source

If you've ever built a desktop system before, then powering an eGPU is not that much different. Treat the GPU normally, treat the PSU normally, but pretend that the PE4H is the motherboard and supply power to it using a 4-pin floppy cable, or molex-to-floppy adapter.

Some of these are pretty noob questions... I only found and started reading up on this stuff 3 days ago, and so far I've gotten my most confusing questions out of the way. Thanks for reading!

No such thing as a bad question, so please, ask away! Don't forget about the main thread that's stickied in this section. There's a lot of folks who are really helpful around these parts, so please do ask your questions about anything you're not certain of.


Yay! Another Nuckie! (I'm from BC)

LOL Sweet ~ My home is in Vancouver but I'm in Alberta just for study purposes. I'm in Alberta more than half the time so I just put Alberta for the heck of it XD

Your processor might as well be the same as mine, so I don't consider it a bottleneck, except in games that demand high CPU performance (mostly RTS). My eGPU runs really well, but is bottlenecked by my 1.1Opt link. For example, I can get really great framerates in games, anywhere from 30 to 120+ fps depending on graphics settings (resolution, texture, anisotropic filtering, anti-aliasing, DX11 tessellation, etc.). I've yet to have any game run below 30 fps unless I had full DX11 at 1920x1080. However, I know that I could get better results if I had full 1.2Opt running.

You simply cannot compare the eGPU to the dGPU GT 540M (I am going to run some benchmarks on the native GT 540M this week, for comparison between eGPU and dGPU). The eGPU blows the GT 540M out of the water, despite only being on 1.1Opt right now!

Omg seriously? You can run over 30fps for everything?!?! GT540M running things like Dead Space 3 can't get over 30FPS I think. I even OC'ed my GT540M LOL..it wasn't meant to be OC'ed but I didn't know about eGPU until 3 days ago. I had the thought of it years ago but it never existed back then.

(I know you probably won't buy a 680/690, but thought I'd address it all the same) I wouldn't bother getting a card as high-end as the 690, since eGPU will still be a bit of a bottleneck. Of course the 690 will perform better than a 660 or 670, but I don't think you'll get the full benefit of such a powerful GPU in an eGPU setup.

Oh yeah well... I was thinking of when this laptop dies, I would be able to get a laptop with Thunderbolt port. I'm not allowed to get a tower you see... so my only option is eGPU. I never thought that the bottleneck would be this small... I guess just calculating numbers isn't comparable to practical uses.

As recommended by Nando, and some other users out there, something like the GTX 460/560/660 is likely your best bang for buck card. Plenty of power from the GPU, yet doesn't require a really high wattage PSU. And a helluva lot cheaper for that matter.

Oh um.. is the mini-PCI-e on the L502x where the Wifi adapter is? I'm not too sure how to get to it.. and are you playing on internal LCD monitor? That's what I intend to do later on for my next laptop. I want Thunderbolt port + Internal LCD display.

Now as for the new PE4H 3.2, I don't know how the power supplies work with that. I only know that with the PE4H 2.4 kit you would supply power to the eGPU just like you normally would on a desktop (use the PSU's own PCI-e power cables to supply power to the GPU card).

For powering the desktop GPU card within the eGPU, there usually is a port on the GPU card itself. See this image for example. Though the cable is not plugged in, you can still see the ports that the cables plug into. This here is a 2 x 6-pin, like the GTX 680 you mentioned.

The way the power cables work is usually:

-PSU plugs into GPU

-PSU plugs into SWEX board or other power switch which controls power on/off for the entire eGPU unit

-PSU plugs into PE4H board

-PSU gets power from power source

If you've ever built a desktop system before, then powering an eGPU is not that much different. Treat the GPU normally, treat the PSU normally, but pretend that the PE4H is the motherboard and supply power to it using a 4-pin floppy cable, or molex-to-floppy adapter.

Oh yea I've seen it and understand the general gist of how it works. I was thinking that the adapter would need to have external supply cables attached to the pin-ports on the graphics card. I guess that isn't needed in this case. Last time I had a tower was probably till I was 12 years old and it wasn't a gaming tower or anything. I was just barely learning about the basics of a computer and such. I didn't have too much of an interest in anything like this. All I did was play games hahaha.

No such thing as a bad question, so please, ask away! Don't forget about the main thread that's stickied in this section. There's a lot of folks who are really helpful around these parts, so please do ask your questions about anything you're not certain of.

Thanks for helping out! I appreciate all the answers and information that you've provided so far :D It really helps clear the confusing questions in my mind or logic that doesn't technically work out to be viable.

About what you said for the PE4L kit and it allowing 1.2 Opt, if I were to buy a new laptop some other year where Thunderbolt ports would be more common, I would most likely have to change PE4L to PE4H right? I don't really know the difference between the two and such but I'm working on reading up on whatever I can find.


LOL Sweet ~ My home is in Vancouver but I'm in Alberta just for study purposes. I'm in Alberta more than half the time so I just put Alberta for the heck of it XD

You'd better be wearing your blues and greens to them Flames/Oilers games...

Omg seriously? You can run over 30fps for everything?!?! GT540M running things like Dead Space 3 can't get over 30FPS I think. I even OC'ed my GT540M LOL..it wasn't meant to be OC'ed but I didn't know about eGPU until 3 days ago. I had the thought of it years ago but it never existed back then.

Well, in full context, with my eGPU I can get 30+ fps in almost all games. Again, that is dependent on the GFX settings I use. If I'm at medium - high and 1920x1080 I'm almost guaranteed to get a flat 60fps with V-Sync. If I turn everything on to max (AA, DX11, motion blur, full ambient occlusion, full volumetric lighting) then things slow down drastically, and I'll see frame rates at 10-40 fps. But let's be realistic, I'm only using a GTX 650 Ti, not a GTX 690.

The GT 540M is a very good mid-range mobile GPU. It's just not nearly as powerful as a desktop GPU. The improvement between the GT 540M and even the old desktop 8800 Ultra is staggering. For fun I pulled out and tested my 8800 Ultra, and I was still able to get a solid, steady 45-50 fps in Borderlands with almost full gfx settings. Using those same gfx settings I eked out maybe 20-40 fps with the GT 540M.

Oh yeah well... I was thinking of when this laptop dies, I would be able to get a laptop with Thunderbolt port. I'm not allowed to get a tower you see... so my only option is eGPU. I never thought that the bottleneck would be this small... I guess just calculating numbers isn't comparable to practical uses.

Well, right now it is not worth getting a high end GPU because of the bottleneck imposed by the actual eGPU kit itself. The PE4L/PE4H is only PCI-e 2.0 compliant. These new high end GPU cards are PCI-e 3.0, so although you are missing out on some performance, the performance hit isn't really that big. Have a look at this recent look at PCI-e 2.0 vs PCI-e 3.0.
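
To put numbers on that for the single-lane eGPU link specifically (the linked article concerns x16 desktop slots): Gen2 uses 8b/10b coding while Gen3 uses 128b/130b. A rough sketch:

```python
# Per-direction payload of one PCIe lane: Gen2 (5 GT/s, 8b/10b) versus
# Gen3 (8 GT/s, 128b/130b).
gen2_mb_s = 5.0 * (8 / 10) / 8 * 1000       # 500 MB/s
gen3_mb_s = 8.0 * (128 / 130) / 8 * 1000    # about 985 MB/s
print(f"Gen3 x1 carries {gen3_mb_s / gen2_mb_s:.2f}x the payload of Gen2 x1")
```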

You'll still get some great performance out of a GTX 690, but unless you want to put down over $500 CAD on one, and want to have 3 monitors going at once, you're better off paying a third of the cost and getting a 660 Ti. You'll also not have to worry about finding a really powerful PSU too :)

Oh um.. is the mini-PCI-e on the L502x where the Wifi adapter is? I'm not too sure how to get to it.. and are you playing on internal LCD monitor? That's what I intend to do later on for my next laptop. I want Thunderbolt port + Internal LCD display.

Yes, if you open up the panel on the underside of the L502x, there are two mPCI-e slots: one is in use by the WiFi card, and there is another free slot that says "TV/WWAN" in big white letters. That second slot is what I'm using. Fits perfectly too.

When I'm gaming I use only my 22" external monitor, but when I'm working I use both external and internal. With an eGPU, you'll definitely be able to use the internal LCD. However, regarding Thunderbolt you should definitely keep up to date with what's going on in the eGPU world. There's been some back-and-forth over TB support, probably a licensing issue. I'm not fully read up on it, but that's my understanding so far.

Oh yea I've seen it and understand the general gist of how it works. I was thinking that the adapter would need to have external supply cables attached to the pin-ports on the graphics card. I guess that isn't needed in this case. Last time I had a tower was probably till I was 12 years old and it wasn't a gaming tower or anything. I was just barely learning about the basics of a computer and such. I didn't have too much of an interest in anything like this. All I did was play games hahaha.

I understand that a lot of people prefer to buy a pre-manufactured power brick much akin to the Xbox 360 power brick. I have no qualms with it, as it's very convenient to store away, and looks a lot less like a medusa of cables. Personally, I'd rather go with a desktop PSU because it can serve multiple purposes. I've been considering getting a new desktop to replace my current one, and both the GPU and PSU can be easily transplanted to the desktop. And as Nando points out in his eGPU briefing, I can timeshare the two!

Thanks for helping out! I appreciate all the answers and information that you've provided so far :D It really helps clear the confusing questions in my mind or logic that doesn't technically work out to be viable.

Glad to help out in any way I can :)

Nando and the others have done an absolutely stellar job with getting users up and running, and getting eGPU on the horizon as "possible DIY" projects. The only caveat is that you must be patient in case you run into trouble, and be willing to try out all sorts of different things.

About what you said for the PE4L kit and it allowing 1.2 Opt, if I were to buy a new laptop some other year where Thunderbolt ports would be more common, I would most likely have to change PE4L to PE4H right? I don't really know the difference between the two and such but I'm working on reading up on whatever I can find.

Sorry, I can't help you there, because I just don't know what kinds of connections that future eGPU kits will offer.

For example, right now the PE4H is modular so that you can plug it into an ExpressCard slot, mini PCI-e slot, and probably even TB (I don't know this for sure). In the future, like say the PE4H 4.0, that kit might come with all kinds of connections like an all-in-one, so that you only need to purchase that one kit and be done with it. Of course that's just my dreaming (like anybody would ever allow such a licensed product!), but you never know!

I can only say for certain that right now the PE4H/PE4L only work with PCI-e slots, and there is another set of Thunderbolt-exclusive products out there that only work on Thunderbolt ports.


You'd better be wearing your blues and greens to them Flames/Oilers games...
LOL HAHAHAHA yea have people looking at me all funny like O___o?
Well, in full context, with my eGPU I can get 30+ fps in almost all games. Again, that is dependent on the GFX settings I use. If I'm at medium - high and 1920x1080 I'm almost guaranteed to get a flat 60fps with V-Sync. If I turn everything on to max (AA, DX11, motion blur, full ambient occlusion, full volumetric lighting) then things slow down drastically, and I'll see frame rates at 10-40 fps. But let's be realistic, I'm only using a GTX 650 Ti, not a GTX 690.

The GT 540M is a very good mid-range mobile GPU. It's just not nearly as powerful as a desktop GPU. The gap between the GT 540M and even the old desktop 8800 Ultra is staggering. For fun I pulled out and tested my 8800 Ultra, and I was still able to get a solid, steady 45-50 fps in Borderlands with almost full gfx settings. Using those same gfx settings I eked out maybe 20-40 fps with the GT 540M.

Well, right now it is not worth getting a high-end GPU because of the bottleneck imposed by the actual eGPU kit itself. The PE4L/PE4H is only PCI-e 2.0 compliant, while the new high-end cards are PCI-e 3.0; that said, the hit from running a 3.0 card on a 2.0 link by itself isn't really that big. Have a look at this recent comparison of PCI-e 2.0 vs PCI-e 3.0.

You'll still get some great performance out of a GTX 690, but unless you want to put down over $500 CAD on one and want to have 3 monitors going at once, you're better off paying a third of the cost for a 660 Ti. You also won't have to worry about finding a really powerful PSU :)

Oh yeah um..does your laptop have Bluetooth? Maybe that's how you have a mini-PCI-e slot? And yea the PE4L and PE4H is only PCI-e 2.0 so yea... I was already thinking of getting a MSI GeForce GTX 660 Twin Frozr 2GB GDDR5. Compared to the Asus model using DirectCU II, this MSI one seems to be the better deal. Asus GeForce GTX 660 DirectCU II OC 2GB GDDR5

Yes, if you open up the panel on the underside of the L502x, there are two mPCI-e slots: one is in use by the WiFi card, and there is another free slot that says "TV/WWAN" in big white letters. That second slot is what I'm using. Fits perfectly too.

Nvm about the question above this quote LOL ~ I'm reading and replying as I go along just cause it's efficient. I'm gonna open up my laptop....

When I'm gaming I use only my 22" external monitor, but when I'm working I use both external and internal. With eGPU, you'll definitely be able to do internal LCD. However, regarding Thunderbolt you should definitely keep up to date with what's going on in the eGPU world. There's a back and forth problem with TB and probably a licensing issue. I'm not fully read up on it, but that's my understanding so far.

Oh yep I'm just catching up from the things that I missed out on... (searching up things and finding dead ends) so I kinda know what's going on. Desktop graphics for your laptop using thunderbolt or expresscard

I understand that a lot of people prefer to buy a pre-manufactured power brick much akin to the Xbox 360 power brick. I have no qualms with it, as it's very convenient to store away, and looks a lot less like a medusa of cables. Personally, I'd rather go with a desktop PSU because it can serve multiple purposes. I've been considering getting a new desktop to replace my current one, and both the GPU and PSU can be easily transplanted to the desktop. And as Nando points out in his eGPU briefing, I can timeshare the two!

Ah I see.. Well I was looking at um.. CORSAIR Builder Series CX500 500W ATX12V v2.3 80 PLUS BRONZE Certified Active PFC Power Supply - Newegg.com since I see all those wires with different pins. The GTX660 uses 2 6-pin connectors and from the details of that link, I'm just wondering what it's for. About the Xbox 360 power brick, I'll have a look at that too! I never actually thought of that. I thought, in order to power a graphics card, I would need the exact pin connectors to make it run. 2 6-pins are 150 watts together. The PE4H is 75 watts. Altogether that's 225 watts.

Don't you need to plug-in both 6-pin connectors into the graphics card for it to run? So an Xbox Power brick can be plugged into one of the 6-pins to make the entire card run?

Sorry, I can't help you there, because I just don't know what kinds of connections that future eGPU kits will offer.

For example, right now the PE4H is modular so that you can plug it into an ExpressCard slot, mini PCI-e slot, and probably even TB (I don't know this for sure). In the future, like say the PE4H 4.0, that kit might come with all kinds of connections like an all-in-one, so that you only need to purchase that one kit and be done with it. Of course that's just my dreaming (like anybody would ever allow such a licensed product!), but you never know!

I can only say for certain that right now the PE4H/PE4L only work with PCI-e slots, and there is another set of Thunderbolt-exclusive products out there that only work on Thunderbolt ports.

Ah ~ yea have a look at that "Desktop graphics for your laptop using thunderbolt port". I believe it works and is very cost efficient assuming that we all already have the PE4H kits ready and running. Thanks for your post again :D ~ Pretty nice read. I'm getting excited just by reading all of this stuff LOL

EDIT: Few things I almost forgot! Did you have to take any special steps to make your setup work or was it pretty much plug-in and TADA!!?

I read something that Tech Inferno Fan posted in the previous forum DIY eGPU experiences - Page 894 where if you have too much RAM, you would have to go through this process.

Also, did you have to go through something called "Chainload Windows"? (something along that line) Just wanting to make sure about the little details I don't know about yet and hoping you could hint me to the right direction. ie. Disabling GT540M, restarting windows and turning on eGPU to make it work (I think it's something like that just to get internal LCD working)


Oh yeah um..does your laptop have Bluetooth? Maybe that's how you have a mini-PCI-e slot? And yea the PE4L and PE4H is only PCI-e 2.0 so yea... I was already thinking of getting a MSI GeForce GTX 660 Twin Frozr 2GB GDDR5. Compared to the Asus model using DirectCU II, this MSI one seems to be the better deal. Asus GeForce GTX 660 DirectCU II OC 2GB GDDR5

...

Nvm about the question above this quote LOL ~ I'm reading and replying as I go along just cause it's efficient. I'm gonna open up my laptop....

Yes, my model does have Bluetooth, but that's something that comes with the Intel 6230 Wifi/BT card. That said, the card is only using a single mPCI-e slot in my machine, so my open slot is used by the PM3N card for my eGPU.

Ah I see.. Well I was looking at um.. CORSAIR Builder Series CX500 500W ATX12V v2.3 80 PLUS BRONZE Certified Active PFC Power Supply - Newegg.com since I see all those wires with different pins. The GTX660 uses 2 6-pin connectors and from the details of that link, I'm just wondering what it's for. About the Xbox 360 power brick, I'll have a look at that too! I never actually thought of that. I thought, in order to power a graphics card, I would need the exact pin connectors to make it run. 2 6-pins are 150 watts together. The PE4H is 75 watts. Altogether that's 225 watts.

Don't you need to plug-in both 6-pin connectors into the graphics card for it to run? So an Xbox Power brick can be plugged into one of the 6-pins to make the entire card run?

The 2x 6-pin PCI-e connectors are for powering the GPU. Modern GPUs draw more power than the PCI-e slot alone can supply, which is what the 2x 6-pins are for. My GTX 650 Ti is much less power hungry than the 660 and up, which is why I only need 1x 6-pin.

The Xbox 360 power brick isn't just a plug and play solution, it requires you to cut off the end of the cable that plugs into the Xbox, and hardmod the wires to allow the proper pins and plugs. While not a difficult task, it's not for users who don't want to tinker and put effort into it either.

In your case, I think you would need a full desktop PSU, as it sounds like you'll need the 2x 6-pin PCI-e power cables for the GTX 660 or higher. I don't know if the Xbox 360 power brick can even supply 2x 6-pins.

The Corsair PSU that you linked looks like a great candidate, though a bit too much power for just an eGPU, unless you're getting it on a really great discount. (You could easily get away with just a 400W PSU.)
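The wattage figures being tossed around here can be turned into a quick sanity check. A minimal sketch: the 75 W slot and 6-pin ratings are the standard PCIe limits, but the headroom factor and example numbers below are just illustrative assumptions, not measurements of any particular card or PSU.

```python
# Rough eGPU power-budget check (illustrative figures, not measured values).
# A PCIe slot supplies up to 75 W; each 6-pin PCIe plug is rated for 75 W.

SLOT_W = 75
SIX_PIN_W = 75

def max_board_power(six_pin_plugs: int) -> int:
    """Upper bound a card can draw from the slot plus auxiliary 6-pin plugs."""
    return SLOT_W + six_pin_plugs * SIX_PIN_W

def psu_ok(six_pin_plugs: int, psu_12v_watts: int, headroom: float = 1.25) -> bool:
    """True if the PSU's 12 V capacity covers the card with ~25% headroom."""
    return psu_12v_watts >= max_board_power(six_pin_plugs) * headroom

# A 2x 6-pin card (a GTX 660-class board, say) tops out around 225 W:
print(max_board_power(2))   # 225
# so a 400 W-class PSU with roughly 384 W available on 12 V is plenty:
print(psu_ok(2, 384))       # True
```

This is the same arithmetic as the "150 W + 75 W = 225 W" tally above, just with a headroom factor added so the PSU isn't run flat out.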

EDIT: Few things I almost forgot! Did you have to take any special steps to make your setup work or was it pretty much plug-in and TADA!!?

I read something that Tech Inferno Fan posted in the previous forum DIY eGPU experiences - Page 894 where if you have too much RAM, you would have to go through this process.

I did, I had to go through a lot of troubleshooting for my specific rig. I don't know why it didn't behave "normally" during my process. Normally within the Setup 1.x software there is a "turn dGPU [off]" call that disables the onboard dGPU GT 540M. This call doesn't work on my machine: it freezes it up and I cannot proceed. Nando helped me get past this by providing a customised call that effectively does the same thing as "turn dGPU [off]", but at a more granular level.

As your machine is much like mine, you will have to deal with the "too much RAM" issue. Booting up straight into Windows with the eGPU will give you an Error 12 in Device Manager on your eGPU. Your dGPU will function perfectly fine, however.

To get around the Error 12 you must compact your PCI space to make room for the eGPU assembly. This is what the Setup 1.x software is for, and this is where my troubles were.

The overall process for installing the eGPU for me (and should likely be for you too) is:

1. plug GPU into PE4L/PE4H board

2. plug PSU into PE4L/PE4H board

3. plug PSU into SWEX

4. plug PM3N into mPCIe slot in L502x

5. connect PE4L/PE4H to PM3N via supplied mHDMI cable

6. power up SWEX (there's an onboard hardware switch on the SWEX board)

7. power up laptop

And then the overall process for using the eGPU for me is:

1. power up eGPU (SWEX switch)

2. power up laptop

3. at boot menu (first time only!) select Setup 1.x

4. start Setup 1.x in menu mode

5. run startup.bat script

6. Chainload MBR

7. at boot menu (second time) select Windows 7 (defaulted choice)

8. boot Windows

Also, did you have to go through something called "Chainload Windows"? (something along that line) Just wanting to make sure about the little details I don't know about yet and hoping you could hint me to the right direction. ie. Disabling GT540M, restarting windows and turning on eGPU to make it work (I think it's something like that just to get internal LCD working)

Yes, I do have to chainload every time I want to use the eGPU. Chainloading basically carries the changes to your dGPU/eGPU/PCI space over into the next bootup of your OS. For example, if I reboot my L502x right now, run Setup 1.x, and don't "Chainload MBR", I will be using my onboard dGPU: without chainloading, the changes made by the startup.bat script aren't retained, so Windows boots up and runs like normal with my dGPU. However, if I reboot now, run Setup 1.x, and Chainload MBR, then Windows will boot up with my eGPU, because the chainload hands the dGPU/eGPU/PCI changes over to Windows, which sees them and runs with them.

Hopefully that answers your questions!


Yes, my model does have Bluetooth, but that's something that comes with the Intel 6230 Wifi/BT card. That said, the card is only using a single mPCI-e slot in my machine, so my open slot is used by the PM3N card for my eGPU.

Ohh I was thinking that if you had a model that has Bluetooth, it would come with a slot with a TV tuner or something like that. I guess not lol.

The 2x 6-pin PCI-e connectors are for powering the GPU. Modern GPUs draw more power than the PCI-e slot alone can supply, which is what the 2x 6-pins are for. My GTX 650 Ti is much less power hungry than the 660 and up, which is why I only need 1x 6-pin.

The Xbox 360 power brick isn't just a plug and play solution, it requires you to cut off the end of the cable that plugs into the Xbox, and hardmod the wires to allow the proper pins and plugs. While not a difficult task, it's not for users who don't want to tinker and put effort into it either.

In your case, I think you would need a full desktop PSU, as it sounds like you'll need the 2x 6-pin PCI-e power cables for the GTX 660 or higher. I don't know if the Xbox 360 power brick can even supply 2x 6-pins.

The Corsair PSU that you linked looks like a great candidate, though a bit too much power for just an eGPU, unless you're getting it on a really great discount. (You could easily get away with just a 400W PSU.)

Ah I see.. I wouldn't mind cutting wires and attaching pins and plugs if I had proper tools. If there's already precedent where these things already work, I don't see why not try it.

I did, I had to go through a lot of troubleshooting for my specific rig. I don't know why it didn't behave "normally" during my process. Normally within the Setup 1.x software there is a "turn dGPU [off]" call that disables the onboard dGPU GT 540M. This call doesn't work on my machine: it freezes it up and I cannot proceed. Nando helped me get past this by providing a customised call that effectively does the same thing as "turn dGPU [off]", but at a more granular level.

As your machine is much like mine, you will have to deal with the "too much RAM" issue. Booting up straight into Windows with the eGPU will give you an Error 12 in Device Manager on your eGPU. Your dGPU will function perfectly fine, however.

Hmm.. the day I get all my pieces and parts, I'll most likely have the same problem as you and will probably be needing that "fix" as well. About the Thunderbolt eGPU, I read up a bit more on the scenario of having internal display with a Thunderbolt eGPU. I don't think it's possible because one person stated that Nvidia Optimus Technology only works on an x1 setup. I don't think Thunderbolt is an x1 setup, as it transfers data faster than mini-PCI-e and ExpressCard slots. Maybe when Thunderbolt gets more generalized within the market, monitors with Thunderbolt will be more popular and by then, hopefully, there will be another way to play on an internal LCD monitor.

To get around the Error 12 you must compact your PCI space to make room for the eGPU assembly. This is what the Setup 1.x software is for, and this is where my troubles were.

The overall process for installing the eGPU for me (and should likely be for you too) is:

1. plug GPU into PE4L/PE4H board

2. plug PSU into PE4L/PE4H board

3. plug PSU into SWEX

4. plug PM3N into mPCIe slot in L502x

5. connect PE4L/PE4H to PM3N via supplied mHDMI cable

6. power up SWEX (there's an onboard hardware switch on the SWEX board)

7. power up laptop

And then the overall process for using the eGPU for me is:

1. power up eGPU (SWEX switch)

2. power up laptop

3. at boot menu (first time only!) select Setup 1.x

4. start Setup 1.x in menu mode

5. run startup.bat script

6. Chainload MBR

7. at boot menu (second time) select Windows 7 (defaulted choice)

8. boot Windows

Yes, I do have to chainload every time I want to use the eGPU. Chainloading basically carries the changes to your dGPU/eGPU/PCI space over into the next bootup of your OS. For example, if I reboot my L502x right now, run Setup 1.x, and don't "Chainload MBR", I will be using my onboard dGPU: without chainloading, the changes made by the startup.bat script aren't retained, so Windows boots up and runs like normal with my dGPU. However, if I reboot now, run Setup 1.x, and Chainload MBR, then Windows will boot up with my eGPU, because the chainload hands the dGPU/eGPU/PCI changes over to Windows, which sees them and runs with them.

Hopefully that answers your questions!

This. I probably gotta save this somewhere for absolute reference LOL. How did you power your PE4H adapter? Did you grab something like an old laptop charger to power it? I'm thinking the pin connector PSU would not power the adapter the same way because the PSU would be too powerful for the adapter. I'm pretty sure the adapter, itself, has limitations.

Oh! About the PE4L board... Would connecting the smaller side of the card affect performance? I'm pretty sure the PE4L card works better than PE4H from reading more upon it. It just seems kinda weird with the logic that less physical connection to the card = better performance.

If the PM3N is beneath your laptop, doesn't your laptop end up having a slight bump? Did you cut a hole for your cover?

I'm going to start referring back to your first post of this thread as to the setup since... you already wrote it down and I probably got too excited and ran straight to replying. About your internal LCD test, "2Optimus tweak engages only when Intel 4500MHD/HD/HD3000/HD4000 primary graphics AND x1 link is detected, improving performance by 20-333%. Provides ability for the internal LCD to render games running on the external GPU."

Source: http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D.html#firststeps

Shouldn't you have better performance on internal LCD?


Ohh I was thinking that if you had a model that has Bluetooth, it would come with a slot with a TV tuner or something like that. I guess not lol.

Nope, my understanding is that all L502x (and probably L702x for that matter) have 2 mPCI-e slots, one is almost always occupied by the WLAN card, and the other is usually used by the TV Tuner/WWAN card if you ordered one. Otherwise, it sits free of anything.

Ah I see.. I wouldn't mind cutting wires and attaching pins and plugs if I had proper tools. If there's already precedent where these things already work, I don't see why not try it.

There are guides here and there about modding a Xbox 360 power supply. Sorry but I can't point you to one because frankly I don't know exactly where they are! Your best bet is to ask in the main DIY Experience thread. It's definitely been done with much success by other people in the past, so we both know for sure that it's a viable option for eGPU. However, you should double check with people that the Xbox 360 mod will allow you to provide power over 2x 6-pin. This is a must if you want to get a GTX _60 or higher GPU, as they require much more power than the lower cards.

Hmm.. the day I get all my pieces and parts, I'll most likely have the same problem as you and will probably be needing that "fix" as well. About the Thunderbolt eGPU, I read up a bit more on the scenario of having internal display with a Thunderbolt eGPU. I don't think it's possible because one person stated that Nvidia Optimus Technology only works on an x1 setup. I don't think Thunderbolt is an x1 setup, as it transfers data faster than mini-PCI-e and ExpressCard slots. Maybe when Thunderbolt gets more generalized within the market, monitors with Thunderbolt will be more popular and by then, hopefully, there will be another way to play on an internal LCD monitor.

Indeed, I read briefly that ThunderBolt eGPU solutions are kind of a two-sided coin: fantastic that you have such a high data rate solution, but also troublesome because it currently doesn't support some things that "older" hardware can handle. My guess is that the latter is purely because it is so new.

I didn't know that Optimus only works under x1 - x16; I assume Thunderbolt won't do Optimus then (speculation only, from the aforementioned info), because x1-x16 is used to describe the PCI bus.

This. I probably gotta save this somewhere for absolute reference LOL. How did you power your PE4H adapter? Did you grab something like an old laptop charger to power it? I'm thinking the pin connector PSU would not power the adapter the same way because the PSU would be too powerful for the adapter. I'm pretty sure the adapter, itself, has limitations.

Well, this thread isn't going anywhere :) It'll only disappear if the entire forum goes down

I power my PE4H board with the desktop PSU. Steps #2 and #3 basically mean: #2 use PSU to provide power to the PCI board like you would normally provide power to a desktop motherboard, and #3 SWEX is like the on/off switch for the entire eGPU kit.

Oh! About the PE4L board... Would connecting the smaller side of the card affect performance? I'm pretty sure the PE4L card works better than PE4H from reading more upon it. It just seems kinda weird with the logic that less physical connection to the card = better performance.

If the PM3N is beneath your laptop, doesn't your laptop end up having a slight bump? Did you cut a hole for your cover?

No, connecting the "smaller side of the card" doesn't affect performance negatively. Don't think of the physical dimensions as having any impact on overall performance. Ever heard of Moore's Law? In brief, it states that every couple of years the number of transistors on something like a CPU doubles, yet look at how small our chipsets are these days, with billions of transistors on them. So a smaller physical connector isn't necessarily worse; a newer, smaller connector could carry more connections than an older, larger one (just an example, not actually the case here).

The PE4L is currently the better option because it allows for 1.2Opt, which is the "20-333%" increase that you reference below. The 1.x Opt is basically data compression over the PCI bus; this is also the "bottleneck" I referred to previously: my 1.1Opt link is my bottleneck, not my GPU. If I had a PCI-e 2.0 capable solution (still working with HIT to get one), the link would be far less of a bottleneck.

As for my PM3N card, no, I did not have to cut a hole in my base cover, I just leave it off. I have a homemade cooling stand that directs air (filtered by a thin cloth) into the area that was previously protected by the base cover. I have not taken photos of my assembly yet; maybe I'll finally get to doing that tonight, and post them here (2nd post). Basically, the only thing that actually sticks out of my chassis is the mHDMI cable. That said, it doesn't disrupt anything by sticking out.

I'm going to start referring back to your first post of this thread as to the setup since... you already wrote it down and I probably got too excited and ran straight to replying. About your internal LCD test, "2Optimus tweak engages only when Intel 4500MHD/HD/HD3000/HD4000 primary graphics AND x1 link is detected, improving performance by 20-333%. Provides ability for the internal LCD to render games running on the external GPU."

Source: http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D.html#firststeps

Shouldn't you have better performance on internal LCD?

As I mentioned above, the 20-333% increase is caused by the data compression over PCI through the Optimus. With my 1.1Opt I am probably getting something like 20-150% performance (just a reference value, not what it really is), and that is the bottleneck I spoke of earlier.

As for getting better performance on the internal LCD, this is not the case. Regardless of what eGPU solution anybody has right now, nobody will have better performance on their internal LCD. As it stands, the internal LCD will always have lower performance compared to using an external panel. I cannot explain the technicals behind it, mostly because I don't know all the details, but it has to do with Optimus and how it interacts with the iGPU and the data compression performed by the 1.x Opt. For example, with 1.1Opt I see horrible results on my internal LCD. While I know I will do much, much better with 1.2Opt on my internal LCD, I know also that I will still get way better results on my external monitor.

Now, this isn't to say that you still won't get way better performance over your GT 540M; any eGPU solution, even on internal LCD, still provides much better performance than our mid-range dGPU provides. It's just that, dollar for dollar, you are getting a better deal out of your eGPU if you're using it on an external monitor. There are some really cheap 22" 1080p LCDs at NCIX, $150 or less after taxes.


Nope, my understanding is that all L502x (and probably L702x for that matter) have 2 mPCI-e slots, one is almost always occupied by the WLAN card, and the other is usually used by the TV Tuner/WWAN card if you ordered one. Otherwise, it sits free of anything.

There are guides here and there about modding a Xbox 360 power supply. Sorry but I can't point you to one because frankly I don't know exactly where they are! Your best bet is to ask in the main DIY Experience thread. It's definitely been done with much success by other people in the past, so we both know for sure that it's a viable option for eGPU. However, you should double check with people that the Xbox 360 mod will allow you to provide power over 2x 6-pin. This is a must if you want to get a GTX _60 or higher GPU, as they require much more power than the lower cards.

Oh no need to worry about the Xbox 360 mod. I found it earlier and had a glance at it. Would you happen to know about Sapphire Radeon HD 7870 GHz Edition OC 2GB PCI-E w/ DVI, HDMI, Dual DP at Memory Express vs MSI GeForce GTX 660 OC 2GB PCI-E w/ Dual DVI, HDMI, DisplayPort at Memory Express ? People say the 7870HD is better than the GTX660 OC. From this thread that Tech Inferno Fan linked, http://forum.techinferno.com/diy-e-gpu-projects/2747-12-5-dell-e6230-gtx660%40x1-2opt-hd7870%40x1-2-pe4l-ec060a-2-1b.html#post37197 , "the HD7870 mostly surpasses the 1.2Opt performance." I'm not sure how that relates because he compared the cards when the GTX 660 used no pci-e compression. Under http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D.html#prepurchasefaq , #13 states the fastest 1.2Opt link is with an Nvidia Fermi or Kepler card.

For internal display, I could use this method: Laptop Forums and Notebook Computer Discussion - View Single Post - DIY eGPU experiences

Indeed, I read briefly that ThunderBolt eGPU solutions are kind of a two-sided coin: fantastic that you have such a high data rate solution, but also troublesome because it currently doesn't support some things that "older" hardware can handle. My guess is that the latter is purely because it is so new.

I didn't know that Optimus only works under x1 - x16; I assume Thunderbolt won't do Optimus then (speculation only, from the aforementioned info), because x1-x16 is used to describe the PCI bus.

Well, this thread isn't going anywhere :) It'll only disappear if the entire forum goes down

Hahaha just have to wait till Thunderbolt hits the general market and maybe tests on eGPU will be more efficient... Maybe by that time, if I manage to learn enough, I might be able to help :D

I power my PE4H board with the desktop PSU. Steps #2 and #3 basically mean: #2 use PSU to provide power to the PCI board like you would normally provide power to a desktop motherboard, and #3 SWEX is like the on/off switch for the entire eGPU kit.

Oh so basically stick one of the pins there anyways. No need for 2 power sources.

No, connecting the "smaller side of the card" doesn't affect performance negatively. Don't think of the physical dimensions as having any impact on overall performance. Ever heard of Moore's Law? In brief, it states that every couple of years the number of transistors on something like a CPU doubles, yet look at how small our chipsets are these days, with billions of transistors on them. So a smaller physical connector isn't necessarily worse; a newer, smaller connector could carry more connections than an older, larger one (just an example, not actually the case here).

The PE4L is currently the better option because it allows for 1.2Opt, which is the "20-333%" increase that you reference below. The 1.x Opt is basically data compression over the PCI bus; this is also the "bottleneck" I referred to previously: my 1.1Opt link is my bottleneck, not my GPU. If I had a PCI-e 2.0 capable solution (still working with HIT to get one), the link would be far less of a bottleneck.

Yea so..the card works itself. The adapter + cord is a slight bottleneck but one card is better than the other.

As for my PM3N card, no, I did not have to cut a hole in my base cover, I just leave it off. I have a homemade cooling stand that directs air (filtered by a thin cloth) into the area that was previously protected by the base cover. I have not taken photos of my assembly yet; maybe I'll finally get to doing that tonight, and post them here (2nd post). Basically, the only thing that actually sticks out of my chassis is the mHDMI cable. That said, it doesn't disrupt anything by sticking out.

Whoa can't wait to see :D ~ Just wondering cause I don't have a cooling stand of any sort. If I were to put in a mHDMI cable, I was just thinking my laptop might have a bump created by the cable... and the PM3N card will eventually snap or something.

As I mentioned above, the 20-333% increase is caused by the data compression over PCI through the Optimus. With my 1.1Opt I am probably getting something like 20-150% performance (just a reference value, not what it really is), and that is the bottleneck I spoke of earlier.

As for getting better performance on the internal LCD, this is not the case. Regardless of what eGPU solution anybody has right now, nobody will have better performance on their internal LCD. As it stands, the internal LCD will always have lower performance compared to using an external panel. I cannot explain the technicals behind it, mostly because I don't know all the details, but it has to do with Optimus and how it interacts with the iGPU and the data compression performed by the 1.x Opt. For example, with 1.1Opt I see horrible results on my internal LCD. While I know I will do much, much better with 1.2Opt on my internal LCD, I know also that I will still get way better results on my external monitor.

So I guess it's Nvidia/Optimus/iGPU that causes this. If it were a Radeon card, maybe it would be slightly different.. I'm not sure. I'm just trying to look for a good setup for my internal LCD screen. I'm thinking the memory interface (ie. 192bit,256bit) on a graphics card won't matter as much for PE4L/PE4H because of the transfer rate.

Now, this isn't to say that you still won't get way better performance over your GT 540M; any eGPU solution, even on internal LCD, still provides much better performance than our mid-range dGPU provides. It's just that, dollar for dollar, you are getting a better deal out of your eGPU if you're using it on an external monitor. There are some really cheap 22" 1080p LCDs at NCIX, $150 or less after taxes.

Trying my best to save $$$ while looking for a good graphics upgrade for a laptop. I guess that's why this forum exists LOL

Link to comment
Share on other sites

Oh, no need to worry about the Xbox 360 mod; I found it earlier and had a glance at it. Would you happen to know about the Sapphire Radeon HD 7870 GHz Edition OC 2GB (DVI, HDMI, dual DP) vs. the MSI GeForce GTX 660 OC 2GB (dual DVI, HDMI, DisplayPort), both at Memory Express? People say the HD 7870 is better than the GTX 660 OC. From the thread that Tech Inferno Fan linked, http://forum.techinferno.com/diy-e-gpu-projects/2747-12-5-dell-e6230-gtx660%40x1-2opt-hd7870%40x1-2-pe4l-ec060a-2-1b.html#post37197 , "the HD7870 mostly surpasses the 1.2Opt performance." I'm not sure how that relates, because he compared the cards when the GTX 660 used no pci-e compression. Under http://forum.techinferno.com/diy-e-gpu-projects/2109-diy-egpu-experiences-%5Bversion-2-0%5D.html#prepurchasefaq , #13 states the fastest 1.2Opt link is with an Nvidia Fermi or Kepler card.

I've never been good at figuring out ATI vs. Nvidia comparisons; I just stick with whichever one feels like a better deal at the time. In my particular case, because I knew that getting the internal LCD to work isn't as simple without an Nvidia card, I stuck with the Nvidia line and then found a card that fit my budget and my needs.

The 7870 sure looks pretty good, so I guess it's a matter of whether you want one brand or the other. Benchmarks only tell half of the truth. The larger truth is that no matter what card you get, there will always be cards more powerful, and weaker, than the one you chose. So just choose the one that suits your use cases best.

Any GTX 400, 500, or 600 series card (all Fermi or newer) should give you 1.2Opt, or at least 1.1Opt if you're stuck with a PCIe Gen1 eGPU adapter. I don't know if ATI cards have any "limitations" in the same sense as Nvidia cards requiring the Fermi generation or newer.

Whoa, can't wait to see :D ~ Just wondering, because I don't have a cooling stand of any sort: if I were to put in a mHDMI cable, my laptop would sit with a bump created by the cable... and eventually the PM3N card might snap or something.

There is some flex in the PM3N card due to the thickness of the mHDMI cable's head. It's enough to make the card bend slightly if you move the mHDMI cable around, but once you set everything down the flex isn't apparent anymore.

I haven't taken any photos of my full eGPU setup because I'm in the process of putting it into an old Shuttle XPC case I just picked up for free (dead system, but case in excellent condition). The dimensions are just right, so I'm excited to re-purpose the Shuttle case to hold my PSU + eGPU.

So I guess it's Nvidia/Optimus/iGPU that causes this. If it were a Radeon card, maybe it would be slightly different... I'm not sure. I'm just trying to find a good setup for my internal LCD screen. I'm thinking the memory interface (i.e. 192-bit, 256-bit) on a graphics card won't matter as much for the PE4L/PE4H because of the x1 link's transfer rate.

Well, I wouldn't completely attribute this to being an issue with Optimus; I mostly attribute it to my 1.1Opt. I can almost guarantee you that even with an ATI card you will see heavily degraded performance compared to an external monitor. It even shows in the link you provided for using Virtu with ATI cards: RE5 on external = 140 fps, RE5 on internal = 60 fps.

I think until eGPU technology improves, internal LCD performance will not be as good as external. Oh well!


For those reading this thread, I've added photos of my enclosure. It's not the final product, but it'll do for now, until I have tools to cut holes in the enclosure's sleeve (to make room for the mHDMI cable) and drill holes in the base to screw the PSU into place.

The enclosure has lots of space inside of it, so it makes it very easy to tuck away the cables when I need to pack it up. It is literally a single box that encloses all the necessary hardware.


  • 2 weeks later...

Hey.

Great work. I have almost the same laptop, and I'm going to do an eGPU setup with a GTX 660. But I don't understand exactly what would happen if I don't use Setup 1.x to disable the dGPU. Would it work at all? And is it maybe possible to disable the dGPU via the BIOS?


Hey.

Great work. I have almost the same laptop, and I'm going to do an eGPU setup with a GTX 660. But I don't understand exactly what would happen if I don't use Setup 1.x to disable the dGPU. Would it work at all? And is it maybe possible to disable the dGPU via the BIOS?

---

I'm about to get started on my own eGPU. I already ordered a PE4L 2.1 and a GTX 660. I have a Dell XPS 15 L502x, with a GT 540M dGPU.

As I understand it, I definitely need Setup 1.x to get it to work. But what would happen if I tried it without Setup 1.x? Would it not work at all, or would I just miss Optimus?

It might work if you can get Win7 to allocate the eGPU and you hotplug it in to overcome mPCIe whitelisting issues. However, the GT 540M would be assigned the Optimus features, so the GTX 660 would run in x1-only mode. There you'd miss out on Optimus internal LCD screen mode AND pci-e x1 compression, which *greatly* accelerates DX9 and somewhat DX10. Getting the greatly desired performance boost requires the GT 540M to be disabled.

From my PM discussion with daver160 we found that neither the stock nor the modified BIOS could disable the dGPU. The modified one does give a PEG option, which didn't do anything. So daver160 resorted to using Setup 1.x to successfully disable the dGPU.

Setup 1.x, when automated, presents as a Win7 boot-menu item: when you want the eGPU you just hit that item, it does its thing and chainloads back to the Win7 boot menu, where you select Win7. It adds about 1.5s to the whole bootup time.


It might work if you can get Win7 to allocate the eGPU and you hotplug it in to overcome mPCIe whitelisting issues. However, the GT 540M would be assigned the Optimus features, so the GTX 660 would run in x1-only mode. There you'd miss out on Optimus internal LCD screen mode AND pci-e x1 compression, which *greatly* accelerates DX9 and somewhat DX10. Getting the greatly desired performance boost requires the GT 540M to be disabled.

From my PM discussion with daver160 we found that neither the stock nor the modified BIOS could disable the dGPU. The modified one does give a PEG option, which didn't do anything. So daver160 resorted to using Setup 1.x to successfully disable the dGPU.

Setup 1.x, when automated, presents as a Win7 boot-menu item: when you want the eGPU you just hit that item, it does its thing and chainloads back to the Win7 boot menu, where you select Win7. It adds about 1.5s to the whole bootup time.

As Nando already stated, our BIOS, even the modified ones by capitankasar (from NBR), cannot actually disable the dGPU. The GT 540M is basically on all the time; in the BIOS we can only specify whether we want to boot Windows with the iGPU or the dGPU.

There is a setting for changing the TOLUD value in the modified BIOS, but Windows 7 (I haven't tried Win 8) does not respect it. For example, by default my TOLUD was set to 3GB in the BIOS, and Windows 7 recognises this (as evidenced in Device Manager). However, changing the TOLUD in the BIOS to values like 2.5GB or 3.5GB did not show up in Windows 7; Device Manager always reported a value of 3GB.
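For anyone wondering why TOLUD matters here, a quick sketch of the arithmetic behind Error 12 (the BAR sizes below are made-up round numbers for illustration, not readings from this machine): 32-bit-addressable device memory has to fit between TOLUD and the 4GB boundary, and with three GPUs requesting space the window can run out.

```python
# Purely illustrative sketch of the 32-bit MMIO squeeze behind Error 12.
# The per-device BAR sizes are hypothetical round numbers.
GB = 1024 ** 3
MB = 1024 ** 2

def mmio_window_bytes(tolud_bytes):
    """Space left below the 4GB boundary for device BARs when RAM is
    mapped up to TOLUD (Top Of Low Usable DRAM)."""
    return 4 * GB - tolud_bytes

window = mmio_window_bytes(3 * GB)  # TOLUD at 3GB leaves a 1GB window
requests = {"iGPU": 256 * MB, "dGPU": 256 * MB,
            "eGPU": 512 * MB, "other": 128 * MB}

total = sum(requests.values())             # 1152MB of BAR requests
print(total > window)                      # True -> doesn't fit -> Error 12
print(total - requests["dGPU"] <= window)  # True -> fits once the dGPU is out
```

That's the intuition behind disabling the dGPU and compacting: freeing its share of the window lets Windows allocate the eGPU's BARs below 4GB.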

Without any tinkering, I don't think anyone with our laptop models (L501/L502) would be able to get the eGPU working without first disabling the dGPU. At least for me, I absolutely have to use Setup 1.x to get my eGPU going.


I hope you, daver160, or Tech Inferno Fan could help me out here.

I finally got all the components for my eGPU setup. My setup differs a bit from daver160's: I got the same laptop, but with the i7-2820qm and Windows 8, the PE4L adapter (for PCIe 2.0 support), and a GTX 660 as the eGPU.

I set it up today, but I couldn't get it to work.

1. Preparing the machine for the eGPU: this worked, after some trying.

2. Installing and setting up the eGPU components: this basically worked too, but if I boot up my notebook with the eGPU powered up and plugged in, it doesn't start; I just get a black screen. It only works if I plug in the eGPU in sleep mode.

3. Running Setup 1.x: didn't work at all. I had the same problem as daver160; it just freezes if I try to disable the dGPU. So I tried to switch out the startup.bat, and then I did not know how to go on.

My startup.bat now looks like this:

setpci -s 1:0.0 COMMAND=0:7 10.l=0,0,0,0,0,0 -s 0:1.0 b0.w=10:10 19.b=0,0 3E.w=0:8 COMMAND=0:7 20.l=0,0call iportbus force

call iport g2 1

call vidwait 60 10de:11c0

call vidinit -d 10de:11c0

call pci

call chainload mbr

Maybe, daver160, you could provide a step-by-step guide of how you did it.

Thanks for your help.


I'm at work right now, so I need to be brief, but I'll address these questions as best I can for now. I'll go into further detail when I get home and can remember exactly what I did.

2. Installing and setting up the eGPU components: this basically worked too, but if I boot up my notebook with the eGPU powered up and plugged in, it doesn't start; I just get a black screen. It only works if I plug in the eGPU in sleep mode.

I've actually not been hotplugging it while Windows is in sleep mode. My method is to have the eGPU off, boot up the machine, boot into Setup 1.x, then power on the eGPU.

I, too, have the weird issue where I get a black screen with the eGPU powered on and plugged in (if I don't boot and chainload from Setup 1.x). I found that turning on and plugging in the eGPU after booting fixes that issue.

3. Running Setup 1.x: didn't work at all. I had the same problem as daver160; it just freezes if I try to disable the dGPU. So I tried to switch out the startup.bat, and then I did not know how to go on.

Have you tested to make sure that Windows detects your eGPU?

Power on your laptop, then power on and plug in your eGPU, and boot into Windows normally. Device Manager should detect your eGPU so that you have 3 display adapters. Your eGPU GTX 660 should be recognised but have an Error 12, which is a good sign.

Now, as for the startup.bat, not sure if it was just a typo, but make sure each call is on its own line, like so:

setpci -s 1:0.0 COMMAND=0:7 10.l=0,0,0,0,0,0 -s 0:1.0 b0.w=10:10 19.b=0,0 3E.w=0:8 COMMAND=0:7 20.l=0,0
call iportbus force
call iport g2 1
call vidwait 60 10de:11c0
call vidinit -d 10de:11c0
call pci
call chainload mbr

Don't forget to make sure that your "10de:11c0" is in fact the one listed on the Setup 1.x screen. When you boot into Setup 1.x, it will detect your iGPU, dGPU, and eGPU; make sure you use the device ID that Setup 1.x reports for your eGPU, otherwise it'll be looking for something that's just not there!
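For reference, "10de:11c0" is just the PCI vendor:device ID pair in hex (10de is Nvidia's vendor ID, 11c0 the GTX 660), and Windows shows the same pair inside the device's hardware ID in Device Manager (Properties > Details > Hardware Ids), so you can cross-check it there too. A tiny hypothetical Python helper (not part of Setup 1.x) to convert between the two forms:

```python
import re

def hwid_to_setup1x(hardware_id):
    """Convert a Windows PCI hardware ID such as
    'PCI\\VEN_10DE&DEV_11C0&SUBSYS_...' into the vid:did
    form used in startup.bat, e.g. '10de:11c0'."""
    m = re.search(r"VEN_([0-9A-Fa-f]{4})&DEV_([0-9A-Fa-f]{4})", hardware_id)
    if not m:
        raise ValueError("not a PCI hardware ID: %r" % hardware_id)
    return "{}:{}".format(m.group(1).lower(), m.group(2).lower())

print(hwid_to_setup1x(r"PCI\VEN_10DE&DEV_11C0&SUBSYS_00000000&REV_A1"))  # 10de:11c0
```

Whichever way you read it off, the number that counts is the one Setup 1.x itself prints when it enumerates your cards.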


I've actually not been hotplugging it while Windows is in sleep mode. My method is to have the eGPU off, boot up the machine, boot into Setup 1.x, then power on the eGPU.

Thanks, daver160. This was the point I was missing. Now Setup 1.x detects the eGPU almost every time, but not always.

Have you tested to make sure that Windows detects your eGPU?

Power on your laptop, then power on and plug in your eGPU, and boot into Windows normally. Device Manager should detect your eGPU so that you have 3 display adapters. Your eGPU GTX 660 should be recognised but have an Error 12, which is a good sign.

This works perfectly.

Don't forget to make sure that your "10de:11c0" is in fact the one listed in the Setup 1.x screen. For example, when you boot up into Setup 1.x, it will detect your iGPU, dGPU, and eGPU. Make sure that you are using the address ID that Setup 1.x sees for your eGPU otherwise it'll be looking for something that's just not there!

The startup.bat you posted works for me, and 10de:11c0 is exactly as listed in Setup 1.x.

I altered "call chainload mbr" to "call grub4dos mbr".

It works both ways, but with the chainload one I get an error message.

If I run the startup.bat and chainload to Windows, the dGPU is disabled, but I still get Code 12. I think this is related to the PCI compaction. I tried several options; with some it works, but there is always the Code 12. I hope you can help me out here.

I've got 8GB of RAM, and even if I change it to 4GB it doesn't work.


It works both ways, but with the chainload one I get an error message.

If I run the startup.bat and chainload to Windows, the dGPU is disabled, but I still get Code 12. I think this is related to the PCI compaction. I tried several options; with some it works, but there is always the Code 12. I hope you can help me out here.

I've got 8GB of RAM, and even if I change it to 4GB it doesn't work.

There's a problem with compaction if you do a dGPU [off] followed by a PCI compaction. Instead, set ignore [dGPU], then do a selective 32-bit compaction, selecting eGPU or iGPU+eGPU. That way the PCI compaction is performed as if the dGPU wasn't there, which is exactly how it appears in your startup.bat if there is a 'call dGPU off' prior to the 'call pci'.


