Posts posted by hishamkali

  1. As for PSU power consumption: the HDPlex PSU is specced for up to 400W of peak power, which I imagine it achieves using the large bank of capacitors attached to the PCB. I measured power consumption with a Kill-A-Watt meter at the wall, so the figures below include power for the AKiTio board and its attached fan. HDPlex advertises 94% efficiency, so after PSU losses and the enclosure's draw I would expect actual GPU power consumption to be about 90% of the results given below.
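    As a back-of-envelope check, the wall-to-GPU conversion looks like this (the ~10W figure for the AKiTio board and fan is my own rough assumption, not a measured value):

```python
# Estimate DC-side GPU power from a Kill-A-Watt wall reading.
# Assumptions: the HDPlex's advertised 94% efficiency holds at this
# load, and the AKiTio board + fan draw roughly 10 W (a guess).
PSU_EFFICIENCY = 0.94
AKITIO_AND_FAN_W = 10.0  # hypothetical overhead, not measured

def gpu_power_from_wall(wall_watts: float) -> float:
    """Convert an AC wall measurement into an estimated GPU draw."""
    dc_watts = wall_watts * PSU_EFFICIENCY  # power left after PSU losses
    return dc_watts - AKITIO_AND_FAN_W      # minus enclosure overhead

print(gpu_power_from_wall(200.0))  # FurMark's 200 W at the wall -> ~178 W
```

    With those assumptions, the 200W FurMark reading works out to roughly 178W at the card, consistent with the ~90% rule of thumb.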

    At 100% TDP in FurMark (worst-case scenario), the card passed the 15-minute burn-in test without issue. Power consumption was 200W. Peak GPU temperature was 83 degrees Celsius with a final fan speed of 70%.

    While running the Crysis 2 benchmark using the 'extreme' preset, the highest power consumption I saw was 178w, and then only briefly. I would say that 150w would be typical during gaming.

    Here is a picture of the setup on my desk:

    post-29007-1449499953771_thumb.jpg

    As for the case: I received my extra acrylic in the mail today. The idea is for the laptop to sit on top of the acrylic case, which would have no air intakes on top of it, so it would be identical to the laptop sitting on the desk. There would be a 120mm intake on the bottom of the acrylic case, which I could suspend above the desk using the rubber feet from the AKiTio enclosure. Similarly, I could install the PSU upside down so that air is drawn in from the underside instead of fighting the laptop for air.

    The laptop CPU gets pretty hot if I run Prime95 on all 8 threads, quickly throttling down to about 2.7GHz at a temperature of 90 degrees Celsius. The 4-core turbo speed is 3.5GHz. I set the turbo time limit to basically infinity (the max number of seconds), raised the CPU current limit to 100A, and raised the TDP from 58W to 63W. I will try a repaste with some MX-4 I have handy while I have the computer apart to install my new IPS screen. Maybe I can also drive the spare laptop LCD externally to create a mobile dual-monitor setup when I need it...

    • Thumbs Up 4
  2. I hope the information provided here helps others who want to try a Windows-based Thunderbolt eGPU. Cost breakdown:

    HP Zbook 15 G2: $1600 from PCNation or B&H Photo Video
    EVGA GTX 970: $350
    AKiTio Thunder2: $229
    HDPLEX 250W: $85
    Dell PA-9E 240W: $60 on eBay (I had one from my previous laptop)
    HP 230W adapter: $25-$30 on eBay (also works, slightly cheaper)
    Acrylic: $25
    Brackets, screws: $15

    eGPU GPU + Adapter + Enclosure + PSU Total: $763

    Laptop + 32GB DDR3: $1900

    Total System Cost : $2663

    Overall not bad for a highly capable, expandable mobile workstation. I use this thing for my work, and I hadn't really had a new computer for 6 years, so it made sense to get something that just worked and had a good warranty.

    I ran a bunch of benchmarks as well. Results are included in the table below. The full spreadsheet containing all of my eGPU benchmarks across all systems to date can be found at the following link: [URL="https://www.dropbox.com/s/bt9igxw7uy9yiql/Comprehensive_Testing.xlsx?dl=0"]eGPU Testing Excel Sheet[/URL]

    Benchmark Configuration:
    OS: Windows 8.1 Professional x64
    CPU: Intel i7 4710MQ @2.5 GHz
    CPU COOLING: HP Zbook 15 G2 Stock Cooling
    RAM: 32.0GB DDR3 1600 MHz
    eGPU: EVGA GTX 970 4.0GB AC 2.0
    eGPU Clocks: Core Clock 1165 MHz, Memory Clock 1753 MHz, Boost Clock 1317 MHz

    Benchmark Results:
    TEST TYPE | x4.2 Int Screen Result | x4.2 Ext Screen Result | Percent Difference, Int vs Ext
    PCIe Speed Test Computer to Card Bandwidth, MB/s | 1266 | 1263 | 0.24%
    PCIe Speed Test Card to Computer Bandwidth, MB/s | 1373 | 1369 | 0.29%
    PCIe Speed Test Bidirectional Bandwidth, MB/s | 1845 | 1845 | 0.00%
    3D Mark 06 3D Marks | 25497 | 28052 | -9.11%
    3D Mark 06 SM 2.0 Score | 10378 | 11460 | -9.44%
    3D Mark 06 SM 3.0 Score | 11656 | 13296 | -12.33%
    3D Mark 06 CPU Score | 7177 | 7335 | -2.15%
    RE5 DX9 1280x800 Variable FPS | 186.1 | 237.5 | -21.64%
    RE5 DX9 1280x800 Fixed FPS | 128.2 | 154 | -16.75%
    3D Mark Vantage Score | 29383 | 32228 | -8.83%
    3D Mark Vantage Graphics | 31987 | 36695 | -12.83%
    3D Mark Vantage CPU | 23616 | 23606 | 0.04%
    RE5 1280x800 DX10 Variable FPS | 187.6 | 230.6 | -18.65%
    RE5 1280x800 DX10 Fixed FPS | 122.4 | 136.1 | -10.07%
    3D Mark 11 Score | 10313 | 10775 | -4.29%
    3D Mark 11 Graphics | 12007 | 12507 | -4.00%
    3D Mark 11 Physics | 7490 | 7620 | -1.71%
    Unigine Heaven 4.0 Extreme Preset 720p Score | 1796 | 1890 | -4.97%
    Unigine Heaven 4.0 Extreme Preset 720p Avg FPS | 71.3 | 75 | -4.93%
    Unigine Heaven 4.0 Extreme Preset 720p Max FPS | 153.6 | 164.9 | -6.85%
    Unigine Heaven 4.0 Extreme Preset 720p Min FPS | 25.3 | 25.8 | -1.94%
    Guild Wars FPS (Eye of North Outside) | 331 | 650 | -49.08%
    COD MW2 (Opening, looking towards trainees) | 61 | 75 | -18.67%
    DMC4 DX9 Scene 1 FPS | 275.81 | 440.89 | -37.44%
    DMC4 DX9 Scene 2 FPS | 261.63 | 368.94 | -29.09%
    DMC4 DX9 Scene 3 FPS | 261 | 346.5 | -24.68%
    DMC4 DX9 Scene 4 FPS | 200.98 | 235.17 | -14.54%
    Crysis 2 Adrenaline Benchmark Times Square DX11 FPS | 78.2 | 88.2 | -11.34%
    COD MW2 (Opening, looking opposite trainees) | 165 | 460 | -64.13%
    3D Mark 13 Fire Strike | 7649 | 8407 | -9.02%
    3D Mark 13 Fire Strike Graphics | 8525 | 9749 | -12.56%
    3D Mark 13 Fire Strike Physics | 9045 | 9253 | -2.25%
    3D Mark 13 Sky Diver | 18043 | 21088 | -14.44%
    3D Mark 13 Sky Diver Graphics | 23605 | 30277 | -22.04%
    3D Mark 13 Sky Diver Physics | 8408 | 8662 | -2.93%
    3D Mark 13 Cloud Gate | 17236 | 20565 | -16.19%
    3D Mark 13 Cloud Gate Graphics | 33176 | 53386 | -37.86%
    3D Mark 13 Cloud Gate Physics | 6428 | 6525 | -1.49%
    3D Mark 13 Ice Storm | 39087 | 126811 | -69.18%
    3D Mark 13 Ice Storm Graphics | 38479 | 265629 | -85.51%
    3D Mark 13 Ice Storm Physics | 41376 | 44824 | -7.69%
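    The percent-difference column is simply the internal-screen result measured against the external-screen result; a minimal sketch of the calculation:

```python
def pct_diff(internal: float, external: float) -> float:
    """Percent change of the internal-screen score relative to the
    external-screen score; negative means internal scored lower."""
    return (internal - external) / external * 100

# Spot checks against two rows of the benchmark table:
print(round(pct_diff(25497, 28052), 2))   # 3D Mark 06 -> -9.11
print(round(pct_diff(38479, 265629), 2))  # Ice Storm Graphics -> -85.51
```

    The pattern is visible in the data: CPU/Physics scores barely move, while lighter, bandwidth-hungry tests like Ice Storm suffer most when the internal screen is driven back over the same Thunderbolt link.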


    3dmark benchmark links

    3dmark-FS.GPU=9749 : [url]http://www.3dmark.com/fs/4052075[/url]
    3dmark11.GPU=12507 : [url]http://www.3dmark.com/3dm11/9424206[/url]
    3dmark06=28052 : [url]http://www.3dmark.com/3dm06/17731211[/url]
    RE5_DX9_1280x800 = 237.5
    • Thumbs Up 5
  3. So I have made a good bit of progress in the last few weeks. I believe I have ironed out most of the kinks and have a reliable setup. I have also made some upgrades, and designed and constructed a Mark 1 version of an enclosure.

    post-29007-1449499952519_thumb.jpg

    Current Laptop Specs

    HP Zbook 15 G2

    OS : Windows 8.1 Professional x64

    Screen: 15 Inch 1080p eDP Internal Screen (Upgrading to IPS 1080p later this week)

    CPU : Intel i7 4710MQ (Overclocked + 200MHz to 3.7GHz Turbo Single Core using Intel XTU)

    RAM : 32.0 GB DDR3 1600MHz

    HDD : 256GB Crucial M500 SSD + 1.0 TB 7200rpm HDD in Optical Bay

    iGPU : Intel HD 4600

    dGPU : AMD Firepro M5100 2.0GB GDDR5

    eGPU Hardware:

    GPU : EVGA GTX 970 SC ACX 2.0 4.0GB GDDR5

    Adapter : AKiTio Thunder2 PCIe Expansion Box + Molex to Barrel Adapter

    Connection : 16Gbps x4.2 link over Thunderbolt 2

    Power Supply: HDPLEX 250w Pico PSU + Dell PA-9E 240w Laptop Power Adapter

    eGPU Enclosure:

    Custom built laser cut acrylic, first pass.

    Molex to Barrel Adapter:

    post-29007-14494999525548_thumb.jpg

    Preliminary Hardware Setup:

    post-29007-14494999526246_thumb.jpg

    Software Setup:

    I experimented a lot to get a working setup. As mentioned earlier, Windows 7 gave a code 12 error, which to my knowledge has not been resolved as of yet. I made the jump to Windows 8.1, and everything seems to work plug and play, with the exception of Nvidia Optimus. Windows 8.1 is actually very nice with StartIsBack installed; liking it so far.

    Starting from a clean installation of Windows 8.1 Pro, I first installed all HP drivers with no eGPU connected. The only exception was the HP-provided driver for the M5100 and Intel HD 4600; Optimus doesn't work if you use those drivers. Instead, I first installed the standard Intel HD 4600 drivers from the Intel website. Then I used the standard Windows 8.1 drivers for the M5100, resulting in it being detected as an 'R9 270M'. I made sure not to install the Catalyst Control Center, as that seemed to interfere with Optimus.

    Afterwards, I disabled my dGPU in Device Manager by disabling the PCI bridge above it. This system topology can be viewed in Device Manager by selecting 'View devices by connection'. The specific device was 'Intel® Xeon® processor E3-1200 v3/4th Gen Core processor PCI Express x16 Controller - 0C01'.

    Once I had this device disabled, I rebooted the computer, completely removing the AMD FirePro M5100 from the PCI bus in Windows. Afterwards, I grabbed the latest NVIDIA drivers from their website, turned on my eGPU, and plugged the Thunderbolt cable in. After the device was initialized by Windows, I installed the NVIDIA drivers using the built-in setup. After a reboot, everything seemed to be working, and as an added bonus, both Optimus and PhysX were working! Previously, the same card on a Dell M6500 with a built-in FirePro M7820 had PhysX disabled due to NVIDIA locking out the functionality when an AMD GPU is present. No such issue with this computer, owing to the built-in Intel iGPU.

    Optimus Working:

    post-29007-14494999526642_thumb.jpg

    When the eGPU is disconnected and I take the computer to my office, I just re-enable the PCI Express controller and am able to use my built-in GPU. Ideally, I'd like it to be detected as an M5100, as the FirePro drivers would be useful for CAD, but one can't have everything. The eGPU still works if the dGPU is active, but I can only accelerate the external screen, as Optimus will not activate with the AMD dGPU on the PCI bus. I don't really need to use the internal screen all that often, as I have a 23-inch monitor, but it is useful for traveling, which I do often.

    That brings me to my next point. With everything working, I set out to make a good-looking, functional enclosure that would provide adequate ventilation while being as compact as physically possible. I have access to a laser cutter, and I purchased some acrylic to get started. I also wanted to make as much use of the provided AKiTio enclosure as possible, as it seemed to provide the best protection for the PCB. The problem is that it is too short for a standard GTX 970, which is remedied with a hacksaw. The AKiTio enclosure (or what remains of it) is attached to the acrylic by screwing M3 screws through the bottom into the other side of the PCB standoffs. Build pics to follow.

    Enclosure Design for Laser Cutting:

    post-29007-14494999527691_thumb.png

    Laser Cutting:

    post-29007-1449499952844_thumb.jpg

    Cutting off the End of the AKiTiO Enclosure with a Hacksaw:

    This was not very easy and involved a lot of manual labor. My advice is to wear gloves and get extra hacksaw blades if necessary. First, cut off the small end piece connecting the back panel of the AKiTio enclosure. Then, bend the back panel away and begin your cut. Being patient helps a lot.

    post-29007-14494999529454_thumb.jpg

    After some effort, victory is mine!

    post-29007-14494999529792_thumb.jpg

    Preparing and Installing the AKiTio PCB

    Since I was already voiding my warranty, I decided to solder 12V power directly to the AKiTio PCB:

    post-29007-14494999530453_thumb.jpg

    Threading L-brackets for attachment of a cover later:

    It just so happened that these corner braces already had 1/4 inch holes. I used a tap and drill to cut 20 threads per inch into the steel:

    post-29007-14494999530784_thumb.jpg

    Assembly Time:

    It's pretty cramped inside the case; a closeup view of the HDPLEX PSU board is included. I had to break my initial attempt at a front panel because the holes I put in for the TB cable were not big enough (the cable is thicker than the connector by a millimeter or so).

    post-29007-14494999532232_thumb.jpg

    Preliminary Testing without Cover (Thermal Stress Test):

    I set the eGPU up running Unigine Valley overnight to make sure it was stable. It seemed to check out; max temps were similar to what I saw in open-air testing. Not too concerned here.

    post-29007-14494999531845_thumb.jpg

    Adding a cover:

    I initially cut only a plain rectangular cover, but I ended up drilling holes for the mounting bracket screws plus vent holes. This was pretty time consuming. For the vent holes, I simply drew a rough grid on the back using a Sharpie and drilled pilot holes through those points. I then came in from the other side with a larger drill bit for the actual holes. Drilling acrylic requires care to avoid cracking. If I do this again, I will simply include the holes in the laser cutter file, as that is really the only way to get them perfect without chipping. Pictured next to the laptop for size.
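    If the vent holes go into the laser-cutter file, the grid is easy to generate programmatically; this sketch uses made-up dimensions (all in mm), not my actual panel measurements:

```python
# Generate (x, y) centers for a rectangular grid of vent holes,
# ready to paste into a laser-cutter drawing. The column/row count,
# pitch, and margin below are illustrative placeholders.
def vent_grid(cols: int, rows: int, pitch: float, margin: float):
    """Hole centers starting `margin` from the panel corner."""
    return [(margin + c * pitch, margin + r * pitch)
            for r in range(rows) for c in range(cols)]

holes = vent_grid(cols=8, rows=5, pitch=10.0, margin=12.0)
print(len(holes))           # 40 holes
print(holes[0], holes[-1])  # (12.0, 12.0) (82.0, 52.0)
```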

    post-29007-14494999532935_thumb.jpg

    All Hooked Up:

    post-29007-14494999533632_thumb.jpg

    Optimus Active:

    post-29007-14494999533972_thumb.jpg

    In the future I'd like to build a better eGPU case, perhaps one with room for the laptop PSU inside it and a hinge. The idea would be for the laptop to sit on top of the case. That would also allow me to put in a 120mm fan with a bottom intake, allowing for better airflow while the laptop is perched on top. I've got some more acrylic coming and will give it a go when I have some downtime.


    • Thumbs Up 1
  4. Hi all,

    I recently purchased a HP Zbook 15 G2 with the following specs:

    OS: Windows 7 x64 and Windows 8.1 Pro x64

    CPU: i7 4710MQ

    RAM: 8.0 GB DDR3 (Will swap out to 32GB DDR3)

    iGPU: Intel HD 4600

    dGPU: AMD Firepro M5100

    Ports of Interest: TB2, Express Card 54

    eGPU Hardware:

    -EVGA GTX 970 SC

    -AKiTio Thunder2 PCI Express Expansion Box

    -Powered PCI Express x16 Riser

    -Corsair CX430 ATX PSU, powered on by SWEX

    -Soldered Molex to Barrel Adapter for AKiTio Expansion Chassis

    So I connected everything together, ensuring that my soldered Molex-to-barrel adapter was wired correctly, and got a code 12 error on Windows 7. I have not been able to overcome the code 12 issue in my limited testing. After being advised by Nando to switch to Windows 8.1, the eGPU worked perfectly via plug and play. It was not necessary to uninstall the dGPU drivers.

    The eGPU appears to be working properly in Windows 8.1 so far, giving me a score of about 28400 in 3DMark06. However, while running the Heaven 4.0 benchmark in extreme mode, I got some black-screen disruptions. I think the eGPU was disconnecting itself, as I heard the Windows new-hardware chime when the benchmark resumed. Amazingly, it didn't crash, but there still must be some instability there. I will note that the powered riser is currently not connected to the PSU for lack of an available Molex connector; I will try connecting it once I acquire a Molex splitter. Also, the Heaven 4.0 score at 1080p Extreme was somewhat lower than what I have seen for this card running at 8Gbps x4.1 with a slower CPU (1066 vs 1200). The 1200 score was on Windows 7, though, and others have noted lower benchmark scores on Windows 8.1 in general.

    I have not been able to enable Nvidia Optimus yet for rendering on the internal display. So far, disabling the built in dGPU results in code 43 on the Intel HD4600. I may have to try completely uninstalling the AMD software / iGPU drivers and starting with iGPU drivers straight from intel.

    I tried running a CUDA bandwidth test and obtained the following numbers:

    HtoD: ~1250 MiB/s

    DtoH: ~1350 MiB/s

    The above would seem to indicate that I am achieving an x4.2 link. GPU-Z indicates that as well.
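    For context, those numbers can be checked against the theoretical ceiling of a 4-lane PCIe 2.0 link. This sketch uses the textbook 8b/10b encoding figure and ignores packet (TLP/DLLP) overhead, so real transfers landing around two-thirds of the ceiling is plausible:

```python
# Theoretical per-direction payload ceiling of a PCIe 2.0 x4 link.
LANES = 4
RAW_BPS = 5_000_000_000                    # 5 GT/s per lane (PCIe 2.0)
payload_bps = LANES * RAW_BPS * 8 // 10    # 8b/10b: 8 payload bits per 10 on the wire
ceiling_MBps = payload_bps // 8 // 10**6   # bits/s -> bytes/s -> MB/s

measured_MBps = 1250 * 1.048576  # ~1250 MiB/s HtoD, converted to MB/s

print(ceiling_MBps)                            # 2000
print(round(measured_MBps / ceiling_MBps, 2))  # ~0.66 of theoretical
```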

    I will be updating this thread as I have more time to test. Would like to make this an integrated box with an SFX PSU.

    Pictures of the setup can be found at the following dropbox link until I can size them down somewhat:

    • Thumbs Up 2
  5. http://forum.techinferno.com/implementation-guides/7388-%5Bguide%5D-15-lenovo-w540-r9_290x-gtx780ti@10gbps-4gbps-sonnet-ee-se2-pe4l-2-1b.html

    Working fine, but the build quality and the lack of TrackPoint mouse buttons... :/ I would choose the M3800 or wait for the W541.

    I found a link with possible pricing information on the W541. Not too cheap. Hoping that actual direct-to-consumer pricing will be a little lower.

    Search - PC Connection

  6. Or wait some time to get a W541 with TB2.

    So the principal issue with me getting the W541 is timing. I have no idea when it will be released, and neither do my university IT folks who purchase through Lenovo.

    The reason I am in the market for a new computer is that my Dell M6500, which I had been using in x4 mode with the PE4H and GTX 970, had its motherboard burn out. It was performing on par with TB1 speeds, as verified by CUDA-Z, but was a mess of wires. I'm not sure what happened, but something must have gotten bumped the wrong way while the computer was on. I thought it was just a bad connection, but proper and careful reconnection also burned out my backup motherboard. I'm typing this on my backup laptop, and a new motherboard will be here on Friday...

    I've already ordered an AKiTio TB2 PCIe expansion box and a powered riser cable. I have a Corsair CX430 I'll be using to power it all. So I'm more or less invested in buying a new laptop that supports Thunderbolt, but also one oriented toward Windows.

    I get a discount on the W540 through my university for a total of 32% off until tomorrow, resulting in a price of $950 for the starting config. I already have 32GB of DDR3, and the W540 has 4 DIMM slots to drop the RAM into. Also, it has an optical bay I can use for a second hard drive. Plus, it supports much faster CPUs than the M3800 (one option): I believe you can go up to an i7-4930MX for an additional $600 or so including the 32% discount, which, if there is no throttling, is fantastic due to the unlocked multiplier and ThrottleStop. I also want to run fully coupled numerical simulations for work on this computer, so that aspect is important to me.

    In addition, another person in my lab just received a w540 as their main computer, so I may be trying it out with theirs once I receive my akitio box in a day or so.

    The M3800, on the other hand, has full support for TB2 instead of TB1 and is available right now. It has the advantages of a touch screen, a much lighter/thinner profile, and aluminum instead of a full plastic body like the ThinkPad. Unfortunately, it does not have a removable battery, though it is configurable with up to a 91Wh battery. The 91Wh battery option apparently removes the option for a 2.5in HDD, so you would have to make do with the mSATA slot. No fancy custom docking station connector here, but there appears to be a nifty USB 3.0 docking station (though I'm not sure how that actually works with regard to video, etc.). The main disadvantages are the soldered-in, slightly slower processor and only 2 DIMM slots for a max of 16GB of RAM. My lab is in the process of buying me a pretty powerful workstation for the office, so the RAM issue may be moot.

    The cheapest M3800 configuration I found came from swapping Windows for Linux for a savings of $100 (I get Windows through my university) and a 5% off coupon I found on the internet for Precisions. Final price of $1450 (8GB of RAM, 500GB HDD).

    The W541 appears to combine the best of both worlds, but if it's not available soon (I tried to email Lenovo sales; no response yet), I may have to settle for the M3800. It's so tempting to pull the trigger on an already-available product.

    TL;DR: W540 vs M3800 = a battle of workstation vs. ultrabook, TB1 vs. TB2, 32GB RAM vs. 16GB RAM, expandability vs. ultimate portability. I mainly care about TB2, followed by CPU, then portability, then RAM. Does anyone own one of these machines and can attest to its suitability for eGPU use?

  7. Any idea whether this laptop would enable Nvidia Optimus on the internal display over thunderbolt 2? I imagine you would have to disable the internal Nvidia Quadro by using setup 1.30 or similar. Need a new workstation laptop and already have an EVGA GTX 970. Trying to decide between this and the (much cheaper) Lenovo Thinkpad w540, which is on sale now for less than $1000.

  8. So I am chugging away at finally getting a successful egpu implementation that I am happy with. I currently have four laptops, one Dell E6500, two Dell M4400s, and now one Dell M6500. The E6500 and M4400s will be given to my family back home as they need the computers, but in the meantime they have been useful testbeds.



    First the specifications of the M6500

    Internal Display: 17" WUXGA

    CPU: i7-920XM @2.00 GHz (I can take it way past 3.0 GHz, but I'm leaving it stock for now to work on the eGPU)

    RAM: 32.0 GB DDR3 1333 MHz

    HDD1: 256GB Crucial M4 SSD

    HDD2: 1.0TB WD 7200RPM

    dGPU: AMD FirePro M7820 (workstation equivalent of the Mobility HD 5870)





    PCI Express Port Layout:

    Port 1: WWAN Port (Empty)

    Port 2: WLAN Port (Occupied by Wi-Fi card Normally, currently removed)

    Port 3: Bluetooth Port

    Port 4: Express Card

    Port 5: mSATA slot? (It's wired for PCI Express according to the motherboard schematic, but I was unable to detect anything plugged into that port.)

    Port 6: Broadcom Ethernet

    Port 7: USB 3.0

    Port 8: Nothing





    TOLUD for this system is set at 3.50GB, meaning we will need a DSDT Override.



    Operating Systems Under Review (not my main PC yet, so I am swapping disk images trying to get things to work):

    Windows 7 Pro x64

    Windows 8.1 x64



    eGPUs Under Testing:

    eGPU1: Galaxy GTX 460 768mb

    eGPU2: EVGA GTX 970 ACX 2.0



    eGPU Hardware:

    Adapter: PE4H v2.4b with 3PM3Ns and One EC Adapter for up to 8Gbps bandwidth (x4.1 Link)

    PSU: Corsair CX430

    HDMI Cables: Cable Matters 3 Feet mHDMI to HDMI Cable + Cable Matters mHDMI Male to HDMI Female

    Enclosure: Laser cut particle board enclosure. Will fully assemble once eGPU Verified Working

    PM3N Enclosure: Old laptop DVD drive case emptied and dremeled. 3D printed bezel to match 3 HDMI female ports coming from PM3N





    So, like any good project, there are numerous issues to overcome. I will try my best to summarize my efforts thus far:



    Windows 7 Pro x64:



    Installed OS, Drivers from Dell Website, Latest Catalyst Driver Suite from AMD.



    Due to the high TOLUD of 3.5GB, I had to do a DSDT override to allow for the eGPU. I followed some of the available guides and used syntax similar to Avlan's DSDT override. I actually modified his code so that I would achieve a 56.25GB endpoint like everyone else. The old DSDT override for Dell Precision/Latitude laptops basically adds 8GB of address space after your RAM, so if you changed the RAM size, it would change your endpoint. Rather simple to fix using the DSDT editor.
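    To double-check the numbers in that override (my own arithmetic, assuming binary gigabytes): the fixed 56.25GB endpoint corresponds to a single physical address, whereas the old RAM+8GB scheme moved around with installed memory.

```python
# The fixed endpoint everyone uses: 56.25 GiB as a physical address.
endpoint = int(56.25 * 2**30)
print(hex(endpoint))  # 0xe10000000

# The old Dell Precision/Latitude override instead ended the window
# 8 GiB past installed RAM, so the endpoint shifted with RAM size:
for ram_gib in (8, 16, 32):
    print(ram_gib, "GiB RAM ->", ram_gib + 8, "GiB endpoint")
```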



    Once I had the DSDT fixed, I was ready for an eGPU. I put the laptop in standby, and connected the GTX 970 via express card only. Upon resume, Windows found new hardware. I canceled any driver installs and installed the latest drivers from Nvidia's website. It then prompted me to reboot.



    Upon rebooting, my issues began. First, I noticed that the Dell M6500 does not like the GTX 970 being connected at bootup. If it sees the GTX 970 on bootup, it kicks the dGPU out of its PCI allocation into some weird quasi-disabled state (still on the PCI bus though) and puts the GTX 970 in its place (or so I think; I haven't checked with a 32-bit memory map). The end result is that I boot with no Dell BIOS splash screen, and Windows 7 loads using the Vista-style loading screen (no nice colorful Windows logo). Once I boot into Windows, I am greeted with a 16-bit 640x480 screen courtesy of the code 12 error against my dGPU. This being Windows 7, I am unable to disable my dGPU and have the eGPU active at the same time (that needs Windows 8; will try soon).



    I can avoid this issue by simply doing the standby/resume method. The GTX 970 comes alive and works great! I was able to collect a full suite of benchmarks using the GTX 970 and an x1.1 link over ExpressCard in this fashion. I can also achieve the same result by halting Windows startup with F8 and connecting the eGPU then. What does not work, however, is any form of compaction in Setup 1.30 with the GTX 970 attached. It doesn't matter whether I boot up with the GTX 970 attached (with the degraded dGPU) or hotplug it after booting into Setup 1.30. Without the eGPU, I am able to load either a disk image or USB image of Setup 1.30 and move my dGPU wherever I please (32-bit, or 36-bit with the 56.25GB endpoint). If it's not in the 32-bit space, I get the Vista-style loading screen, but everything works properly in Windows. So I know I can compact and chainload with no eGPU attached.



    The problem is that when I try to compact with the eGPU attached, I get weird issues. If I am using a USB disk, the image freezes after any compaction unless no solution is found. Sometimes I get an error message that says 'could not find fdconfig.sys' (using settings that would be compatible if it were just the dGPU). If I am using a disk image installation, I sometimes simply get a corrupted disk image (I assume SATA support is more native than USB). I think I may know what could be causing this, though. This Dell laptop has three internal HDD slots (including the mSATA one), as well as an eSATA port on the laptop and another on the dock, for a total of 5 SATA ports. I have read reports online of other people with these laptops complaining that the BIOS essentially shuffles these disk indexes around as it pleases, which would wreak havoc on the OS's file writes, or on chainloading, since that's what it relies on. For example, the primary hard disk could have an index of 0 at one point, then after compaction have an index of 1. Compaction would then go to whatever is occupying index 0 and try to continue reading/writing files, resulting in the freeze after the PCI write.



    Other strange things I have noticed: sometimes, when hotplugging the eGPU in Setup 1.30 and waiting for detection, a hardware ID is detected, but it isn't the right one. It's not even the HD Audio hardware ID. The right one for this card is 10DE:13C2. I have seen 10DE:13C0 (I think this is a GTX 980?) and 10DE:0008 (a very, very old card's hardware ID, I believe a GeForce MX420). I almost wonder if there are some quirks in the VGA BIOS.



    I got this card working at an x4.1 link with moderate success on my Dell E6500 with a C2D X9100 CPU and an Intel 4500MHD iGPU. That worked fine with no conflicts; however, PCI Express root port number three was rather flaky and would lead to spontaneous crashing. But I do know this video card works with at least some Dell systems and can indeed run at an x4.1 link.



    I have also noticed that when I boot into Setup 1.30 with the GTX 970 connected, the status bar at the top right doesn't show how much free space is available. It also doesn't show a PCI allocation status of yes* or no*. So something is being overridden by the eGPU; I don't know enough about the software to say what.



    I have to solve this compaction issue if I want an x4.1 link on Windows 7 x64, as hotplugging the PM3Ns while in Windows does not seem to make it put the GTX 970 in 36-bit space no matter what I do. I have tried uninstalling all GPU drivers and deleting the PCI root port entries in Device Manager. So I am stuck here with Windows 7 and the GTX 970 so far. Any ideas?



    Here are some benchmarks I did for the GTX 970 at x1.1 on an Express Card Link with the M6500

    TEST TYPE | x1.1 PCIe Link Result
    PCIe Speed Test Computer to Card Bandwidth, MB/s | 181
    PCIe Speed Test Card to Computer Bandwidth, MB/s | 199
    PCIe Speed Test Bidirectional Bandwidth, MB/s | 331
    3D Mark 06 3D Marks | 6511
    3D Mark 06 SM 2.0 Score | 2508
    3D Mark 06 SM 3.0 Score | 2309
    3D Mark 06 CPU Score | 4837
    RE5 DX9 1280x800 Variable FPS | 92.5
    RE5 DX9 1280x800 Fixed FPS | 42.5
    3D Mark Vantage Score | 23880
    3D Mark Vantage Graphics | 28093
    3D Mark Vantage CPU | 16470
    RE5 1280x800 DX10 Variable FPS | 122.2
    RE5 1280x800 DX10 Fixed FPS | 55.4
    3D Mark 11 Score | 8489
    3D Mark 11 Graphics | 9781
    3D Mark 11 Physics | 6212
    Unigine Heaven 4.0 Extreme Preset 720p Score | 1339
    Unigine Heaven 4.0 Extreme Preset 720p Avg FPS | 53.2
    Unigine Heaven 4.0 Extreme Preset 720p Max FPS | 128.8
    Unigine Heaven 4.0 Extreme Preset 720p Min FPS | 11.5
    Guild Wars FPS (Eye of North Outside) | 168
    COD MW2 (Opening, looking towards trainees) | 13
    DMC4 DX9 Scene 1 FPS | 109.62
    DMC4 DX9 Scene 2 FPS | 80.887
    DMC4 DX9 Scene 3 FPS | 154.01
    DMC4 DX9 Scene 4 FPS | 59.1
    Crysis 2 Adrenaline Benchmark Times Square DX11 FPS | 58.2




    - - - Updated - - -





    In this next post I wanted to document my experiences with my trusty old GTX 460. It has been running perfectly for over three years in various eGPU setups and implementations (it has worked with all four of my laptops). Maybe this GTX 460 incorporates some kind of PCI reset delay that makes it advantageous for eGPU detection / avoiding BIOS interference...



    Same system as above, still on Windows 7 here. Plugging the GTX 460 in using the same standby/remove method worked with the M6500. Drivers installed properly, and Windows moved the eGPU to 36-bit space. The nice thing about the GTX 460 is that I can boot with the eGPU attached without affecting the dGPU. I'm not sure why this is.



    Booting into Setup 1.30 with the GTX 460 attached yields proper detection as expected, and this time compaction works properly. Essentially what I did was compaction on the dGPU and eGPU with a 56.25GB endpoint, forcing the dGPU to 32-bit. I also had to use the 'close unused bridges' flag or I would hang on the Windows 7 loading screen. I have had to set this flag for all of my Dell laptops, but I still don't quite understand why I need it or what it does. I should also mention that I was using a flash drive for this bit of testing. Since the flash drive worked without issues, I didn't try a disk image / secondary HDD install.



    Then I was able to boot the GTX 460 on mPCIe port 1. If I do so without using Setup 1.30, I get code 12 errors against some of my other devices in Device Manager. If I use Setup 1.30, no such errors are present, so I will be sticking with that. Everything works properly, with the exception of the HD Audio device, which has a code 10 against it. I couldn't manually update the drivers for that specific device, so I will be trying a driver reinstall. I get output to the external screen, and I was able to run a PCI Express link stress test and verify that I was getting full speed.



    I tried mPCIe ports 2 and 3. They seem to work at first, with the GTX 460 listed in Device Manager with no errors against it. However, if I actually try to use the card, the screen flashes and generally goes haywire: lots of 'NVIDIA driver has stopped responding' errors followed by BSODs. I'm not sure why this would happen, as I'm using the same cables I was using for mPCIe port 1, which worked flawlessly. I should note that mPCIe port 1 is a full-height port while ports 2 and 3 are half-height, so maybe they are power starved?



    To test the power starvation theory, I tried an x2 link. In an x2 link, all of the 3.3v power should come from mpcie port 1, with only 4 tx/rx pins being used for the low voltage PCI Express data signals from port 2. However, that configuration was extremely unstable, leading to BSODs as before. I purchased this M6500 refurbished, and I think its mainboard is bad. I took it apart, and there has to be some sort of short somewhere: my keyboard backlight doesn't work, ethernet is hit or miss, and a popping noise sometimes comes from the speakers just before Windows loads. I really want all those auxiliary devices to work, as I still use this as a laptop for engineering work, so I purchased a new mainboard pretty cheap on eBay. It should arrive early next week for me to test.



    Hopefully I can get this thing working with the new mainboard (unless people can think of other reasons why ports 2 and 3 wouldn't work). The GTX 460 is detected properly and shows no errors in device manager on ports 2 and 3; it just BSODs when I try to use it. I think the error is actually in the PCI Express signaling, with a possibly garbled signal / noise (maybe from other faulty components such as the ethernet). It really makes me want to search the computer engineering department at our university for folks who might have a PCI Express protocol analyzer. It would be useful in figuring out what exactly is failing...



    I'm currently preparing a Windows 8.1 image so that I can try using the GTX 970 with the dGPU disabled. It's pretty easy to swap OSs with Macrium Reflect, so I'll be doing that until I have some permanent, tested working solution (staying far far away from my actual critical work OS install until then).



    Oh yeah, benchmarks for the GTX 460 on the Dell M6500:

    TEST TYPE x1.1 PCIe Link Result
    PCIe Speed Test Computer to Card Bandwidth 164 MB/s
    PCIe Speed Test Card to Computer Bandwidth 198 MB/s
    PCIe Speed Test Bidirectional Bandwidth 180 MB/s
    3D Mark 06 Score 5321
    3D Mark 06 SM 2.0 Score 2078
    3D Mark 06 SM 3.0 Score 1976
    3D Mark 06 CPU Score 2971
    RE5 DX9 1280x800 Variable 76.7 FPS
    RE5 DX9 1280x800 Fixed 38.8 FPS
    3D Mark Vantage Score 7531
    3D Mark Vantage Graphics 7003
    3D Mark Vantage CPU 9729
    RE5 1280x800 DX10 Variable 69.4 FPS
    RE5 1280x800 DX10 Fixed 40.4 FPS
    3d Mark 11 Score 2437
    3D Mark 11 Graphics 2187
    3D Mark 11 Physics 6201
    Uniengine Heaven 4.0 Extreme Preset 720p Score 277
    Uniengine Heaven 4.0 Extreme Preset 720p Avg FPS 11
    Uniengine Heaven 4.0 Extreme Preset 720p Max FPS 32.8
    Uniengine Heaven 4.0 Extreme Preset 720p Min FPS 7.5
    Guild Wars (Eye of North Outside) 148 FPS
    COD MW2 (Opening, looking towards trainees) 10 FPS
    DMC4 DX9 Scene 1 103.07 FPS
    DMC4 DX9 Scene 2 76.52 FPS
    DMC4 DX9 Scene 3 141.37 FPS
    DMC4 DX9 Scene 4 50.86 FPS
    Crysis 2 Adrenaline Benchmark Times Square DX11 17.1 FPS
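    For reference on the bandwidth rows: a PCIe 1.1 x1 lane runs at 2.5 GT/s with 8b/10b encoding, which works out to a 250 MB/s ceiling before packet overhead, so 160-200 MB/s measured is about what one should expect. A quick Python sanity check (the overhead-free ceiling is the only assumption here):

    ```python
    # PCIe 1.1 x1: 2.5 GT/s per lane, 8b/10b encoding -> 250 MB/s raw ceiling.
    LANE_RATE = 2.5e9        # transfers per second
    ENCODING = 8 / 10        # 8b/10b line coding efficiency
    raw_mb_s = LANE_RATE * ENCODING / 8 / 1e6   # bits -> bytes -> MB/s

    # Measured numbers from the table above.
    measured = {"host to card": 164, "card to host": 198, "bidirectional": 180}
    for name, mb_s in measured.items():
        print(f"{name}: {mb_s} MB/s = {mb_s / raw_mb_s:.0%} of ceiling")
    ```

    So the link is delivering roughly 65-80% of the theoretical lane rate, which is normal once TLP headers and flow control are accounted for.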




    - - - Updated - - -





    Still trying to compact with my GTX 970 attached, but I believe I am gaining some insight into the problem. I really need help here though, and I haven't found solutions by searching.



    First, even when compaction freezes and doesn't bring me back to the status window, setup 1.30 saves pci.bat. I can boot into setup 1.30, this time selecting the DOS prompt, and call pci.bat. What's interesting is that it actually runs without issue. The real problem seems to be that the C: drive (the USB stick) has fallen off its mount. I can still access E: (my Windows drive), but there is no C: access.



    I thought, no problem, let me chainload using MBR, since maybe the BIOS renumbered the HDDs. So I typed 'call chainload mbr noremap'. That didn't work, so I then tried 'call chainload bootmgr noremap'. Also didn't work. Then 'call chainload bootmgr'. Also didn't work.



    This problem seems entirely specific to having the 970 attached. If I leave the 970 attached and powered on before boot, I don't see the Dell POST screen, but setup 1.30 works fine. I still can't compact without freezing (once again, because the USB drive gets knocked off its mount). I have also tried booting into setup 1.30 without the eGPU attached, then connecting the eGPU and making sure it's detected. Same freeze after compaction. I tried a disk image install as well, and all it seems to do is corrupt the setup 1.30 install environment (I assume it can't corrupt the USB stick version since that drive is unaddressable / dismounted).



    I'm really not sure why this is happening. Booting up with the Galaxy GTX 460 causes no problems whatsoever. The machine posts with the Dell logo properly, and then continues to setup 1.30. There, I can do a compaction on 'dGPU eGPU', putting the egpu into 36bit space and forcing the dGPU to 32bit. Booting into Windows 7 works. All of this from the same USB stick.



    I did some initial digging and found that the video card status display in nvflash works for the GTX 460, but not for the GTX 970; it simply says 'no Nvidia GPUs detected'. This error message is in spite of the fact that the eGPU is detected and sitting on port 4 with the correct hardware ID. I followed the instructions for creating and loading a pcidump bin file to put the device on the PCI bus using RW-Everything, but it still doesn't show up in nvflash. I assume this is because the included version of nvflash doesn't know about the GTX 970 series, since it's so new.



    To that end, I have run the compact fail diagnostic. I am attaching the diag.zip file to this post if anyone cares to take a look at it. If you have a Dell Core 2 Duo or First Gen i7 business laptop, how did you overcome these issues if you had them?



    - - - Updated - - -





    Ever on the quest to solve my PCI compaction freezing and disk access issues, I believe I am hitting on the source of the problem. The Dell M6500 does not include native USB 3.0, so it instead attaches a USB 3.0 controller to the southbridge on PCI express root port number 7.



    I tried looking at the 32bit memory map in three different scenarios:

    No eGPU Plugged in at Boot or Hotplugged Later

    No eGPU Plugged in at Boot, but Hotplugged after booting into setup 1.30

    eGPU Plugged in at Boot, Taking Over as Primary Video Device (Get code 12 against dgpu if I boot into windows this way).



    I have included screenshots from my setup 1.30 32bit memory map dumps for all of these scenarios at the following dropbox link:

    Pictures Here



    What's interesting is that the NEC USB 3.0 controller sits at the end of the 32bit space, which is where I believe setup 1.30 likes to put GPUs. I have noticed that the primary GPU instead sits at the beginning of the PCI configuration space on this laptop.



    I have tried disabling USB 3.0 in the BIOS, but the controller still shows up in setup 1.30. If I disable root port number 7, the 'serial bus' description of the USB 3.0 controller changes to a 'mass storage controller' or basic USB. I can then set a 56.25GB endpoint and try compaction on all PCI devices with the exception of SATA, USB, PATA, etc.



    What's interesting is that when I do so, despite not forcing any GPU to 32bit space, I get an error saying that no solution could be found. How exactly could no solution be found if I have all of that 36bit space available? (I technically have 48.5GB to 56.25GB free in my DSDT override.) Maybe setup 1.30 doesn't like to put both GPUs in the 36bit space?
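    One possible explanation for "no solution found" despite ample total space: PCI BARs have to sit at addresses aligned to their own size, so an allocator can fail on alignment even when the raw free space is bigger than the sum of the BARs. This is only a toy sketch of that constraint, not setup 1.30's actual solver, and the window addresses and device names below are made up for illustration:

    ```python
    # Toy model of PCI BAR placement: each BAR must be naturally aligned
    # (base address is a multiple of the BAR size).

    def align_up(addr, size):
        # Round addr up to the next multiple of size (size is a power of two).
        return (addr + size - 1) & ~(size - 1)

    def allocate(bars, window_start, window_end):
        """First-fit, largest-first; returns base addresses, or None on failure."""
        placed, cursor = {}, window_start
        for name, size in sorted(bars.items(), key=lambda kv: -kv[1]):
            base = align_up(cursor, size)
            if base + size > window_end:
                return None
            placed[name] = base
            cursor = base + size
        return placed

    MB = 1 << 20
    # A 256 MB window that is NOT 256 MB-aligned: plenty of room in total,
    # but the single 256 MB BAR has no aligned home inside it.
    print(allocate({"gpu_vram": 256 * MB}, 0xC8000000, 0xD8000000))   # None
    # Shift the window to a 256 MB boundary and the same request fits.
    print(allocate({"gpu_vram": 256 * MB}, 0xD0000000, 0xE0000000))
    ```

    If setup 1.30's solver runs into a similar alignment wall with both GPUs in the 36bit window, it would report failure even with tens of GB nominally free.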



    If I disconnect the egpu after booting, I am able to reallocate the dGPU back to the 32 bit space or 36bit space successfully by performing compact on scope dGPU. I have to select 'close unused bridges', otherwise I get a BSOD. I can then load to windows fine. If I have allocated the dGPU to 36bit space, I get the vista style loading screen, but my dGPU functions perfectly in Windows, with no conflicts. Allocating the dGPU to 32 bit space with the eGPU disconnected after booting into setup 1.30 gives the same working result, but this time with the fancy windows 7 loading screen.



    I also took the time to document the EVGA GTX 970's PCI space requirements. After some math, it appears to require 256MB + 32MB. Perhaps this is the distinction causing problems vs. the GTX 460, which I believe requires a total of 256MB like most other GPUs. Can anyone confirm this is the case for all GTX 970s?



    In addition, sometimes my GTX 970 is detected with strange PCI Hardware IDs. The normal hardware ID is 0x13C2. So far, I have seen 0x0008, 0x0408, 0x13C0, 0xC042, and 0xE554. Sending a hot reset to the pci express port always fixes the problem.
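    Out of curiosity, one way to judge whether those bogus IDs look like link bit errors is the Hamming distance of the 16-bit device ID from the real 0x13C2 (a distance of 1 would point at a single flipped bit on the wire):

    ```python
    # Hamming distance between the real GTX 970 device ID and the bogus
    # IDs I observed: count the differing bits of the 16-bit values.

    REAL_ID = 0x13C2
    bogus = [0x0008, 0x0408, 0x13C0, 0xC042, 0xE554]

    def hamming16(a, b):
        return bin((a ^ b) & 0xFFFF).count("1")

    for dev_id in bogus:
        print(f"0x{dev_id:04X}: {hamming16(REAL_ID, dev_id)} bit(s) differ")
    ```

    Only 0x13C0 is a single bit flip away; the rest differ in many bits, which (speculation on my part) looks less like line noise and more like the config read racing the card's reset, consistent with a hot reset always fixing it.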



    I have a feeling this problem is solvable via software, as hotplugging via the standby / resume method works in Windows 7 over Express Card, allocating the eGPU to the 36bit space. I can't use the same method on the PM3Ns though; I get code 12 on any of ports 1 - 4.



    I looked at the IRQs of all of the PCI Express ports, and it appears that ports 1 and 5, 2 and 6, 3 and 7, and 4 and 8 share IRQs 16, 17, 18, and 19 respectively. However, disabling port 7, for example, while hotplugging to port 3 still gives me error 12 in Windows 7.
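    That IRQ pattern is just a modulo-4 mapping of root port number onto four interrupt lines; a one-liner reproduces the table I observed (this is my observed pattern on this chipset, not a general guarantee):

    ```python
    # Pattern I observed: root ports 1 & 5 -> IRQ 16, 2 & 6 -> 17,
    # 3 & 7 -> 18, 4 & 8 -> 19. That is port number mod 4 onto IRQs 16-19.

    def root_port_irq(port):   # port numbered 1..8
        return 16 + (port - 1) % 4

    for port in range(1, 9):
        print(f"root port {port}: IRQ {root_port_irq(port)}")
    ```

    Since paired ports share a line, disabling one member of a pair shouldn't change which IRQ the other gets, which may be why disabling port 7 didn't help the code 12 on port 3.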



    I will note that occasionally, after PCI compaction in setup 1.30 (including saving pci.bat), I still get the system freeze, but then my system spontaneously shuts down. I'm not sure why it's doing that. I have a new motherboard on the way to fix my bad ethernet, so I will try again with that later in the week.



    I hope this information helps anyone who can provide some insight into the problem at hand. If more information is needed, let me know and I will gladly provide it.
    • Thumbs Up 1
  9. So I just wanted to provide an update. I am currently testing an EVGA GTX 970 with my M4400. As I suspected, there is no way (that I have discovered) to install the r343 drivers for the GTX 970 and the r340 drivers for the Quadro FX 1700m (a DX10 GPU) at the same time. This problem is the single greatest hurdle to an eGPU setup with the GTX 970.

    I also have a Dell E6500 with an x9100 CPU and integrated Intel graphics, so I have been testing with that configuration too. I plan to make it an x1.opt setup with the GTX 460, but in the meantime I have been using it for x1 and x4 testing of the GTX 970. However, I am having issues with one of the mPCIe ports / riser cables (port 3): it returns errors and generally offers less performance than all the other ports. I want to try this again with shorter ribbon cables later.

    I'm kind of wondering if I should swap the GTX 970 back for the R9 290. The GTX 970 with the Dell E6500 is great. It doesn't even require pcie compaction (DSDT override), just port switching to enable x4.1. However, due to the aforementioned instability issue on port 3, it's still not working perfectly. The GTX 970 has superior noise and thermal characteristics as well as more advanced driver support and also doesn't require wonky hotplugging of mpcie ports.

    I am also making progress on the enclosure. I have purchased a 4'x2' sheet of cheap particle board from which I will make a prototype enclosure. I have drawn up the cutout in a Google Sketchup file for use by the laser cutter at my school. The case is designed to be roughly the same planform area as the laptop, with the laptop sitting on top.

    I have updated my build log dropbox link with additional pictures. The circuit board you see next to the GTX 460 on the desk is an MT6820-B, which takes an external GPU input and converts it to LVDS (the same signal used by the laptop's internal display). Found at: Amazon.com : Beautyforall MT6820-B 10 Inch To 42 Inch 5V Universal LVDS LCD Monitor Driver Controller Board With Cable : Camera & Photo . It just so happens that this PCB fits inside my eGPU caddy. In combination with a small micro HDMI to VGA adapter and a custom switching board I want to design and build (using a Maxim MAX14979E LVDS switching IC), I should be able to have a switchable external input into my internal laptop LCD.

    With USB boot enabled, my M4400 and E6500 boot and run startup.bat extremely quickly. I found that in my old egpu case, the R9 290 was probably not as fully inserted in the slot as it should have been, which possibly fueled its detection issues. I have since rectified that on the GTX 970, which gave me no boot at all until I removed that particular problem. I'm still learning a lot with all this, but I'm determined to see it through to eventual completion. If my Dell E6500 could support my QX9300, I would be golden now. However, I still use my laptop for intensive computational work as a laptop, so the built in Quadro is useful (GPGPU and 3D Modeling).

    With that said, I'm linking to a different picture archive with pics of the GTX 970 setup. I'm going to the laser cutter tomorrow, though you could just as easily cut it out with conventional tools if you wanted. Files are included in the dropbox link:

    [URL]https://www.dropbox.com/sh/r8j8scgafditgzh/AABggihY0gK2g7-HDFmDBwAOa?dl=0[/URL]

    I have also completed some additional benchmarking. Here are some results in progress:

    TEST TYPE | Dell M4400: R9 290 @x1.1, @x4.1, % Increase | Dell E6500: GTX 460 @x1.1, @x4.1, % Increase | Dell E6500: GTX 970 @x1.1, @x4.1, % Increase
    3D Mark 06 3D Marks 17082 17082 0.0% 5447 12920 137.2% 6160 15117 145.4%
    3D Mark 06 SM 2.0 Score 6125 6125 0.0% 2156 6026 179.5% 2535 6755 166.5%
    3D Mark 06 SM 3.0 Score 8988 8988 0.0% 1992 5715 186.9% 2218 7770 250.3%
    3D Mark 06 CPU Score 4429 4429 0.0% 3054 3080 0.9% 3111 3103 -0.3%
    3D Mark Vantage Score 18339 20015 9.1% 7283 9572 31.4% 15317 16252 6.1%
    3D Mark Vantage Graphics 22634 25924 14.5% 7574 11110 46.7% 26917 30300 12.6%
    3D Mark Vantage CPU 11686 11887 1.7% 6531 6764 3.6% 6680 6797 1.8%
    3d Mark 11 Score 6991 8188 17.1% 2495 2922 17.1% 6370 6517 2.3%
    3D Mark 11 Graphics 9661 12836 32.9% 2393 2978 24.4% 11115 13563 22.0%
    3D Mark 11 Physics 3828 3967 3.6% 2697 2754 2.1% 2774 2535 -8.6%

    Guild Wars FPS (Eye of North Outside) 19 65 242.1% 87 345 296.6% 150 353 135.3%
    COD MW2 (Opening, looking towards trainees) 8 22 175.0% 8 30 275.0% 12 30 150.0%
    Crysis 2 Adrenaline Benchmark Times Square DX 11 FPS 55.8 73.5 31.7% 17.4 25.8 48.3% 59.4 69.5 17.0%
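    The "% Increase" columns above are just (x4 - x1) / x1; recomputing a couple of rows as a sanity check:

    ```python
    # Recompute the percentage-increase column from the raw scores above.

    def pct_increase(x1_score, x4_score):
        return (x4_score - x1_score) / x1_score * 100

    rows = {
        "3DMark06, E6500 GTX 460": (5447, 12920),        # table says 137.2%
        "3DMark11 Graphics, E6500 GTX 970": (11115, 13563),  # table says 22.0%
    }
    for name, (x1, x4) in rows.items():
        print(f"{name}: {pct_increase(x1, x4):.1f}%")
    ```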


    If anyone knows how to test the real-world PCIe bandwidth of an R9 290, let me know. Pciespeedtest v0.2 doesn't work for me; I may have to modify its source code to include the R9 200 series. Once again, suggestions regarding the conflicting driver situation are welcome. There is no iGPU in the M4400, but maybe I can disable the dGPU in setup 1.30 so that the Quadro and GeForce are never running at the same time (avoiding the BSOD from conflicting Nvidia drivers).
    • Thumbs Up 1
  10. I am in the middle of some benchmarks now. The benchmarked games are pretty much what you saw above. I have completed x1.1 testing with the R9 290 and am moving through x4.1 testing now. I got much smaller performance increases in moving from x1.1 to x4.1 with the 290 as opposed to the GTX 460. I've been looking for a working utility to try and test the real throughput of the AMD card's pcie interface, but to no avail. Pciespeedtest.exe doesn't work on my system for some reason.

    I am also seeing some relatively low DX9 performance (slower than the GTX 460 in some games). Not really sure what is going on here. Maybe the R9 290 doesn't think those games are quite worth its time and never comes out of downclocked mode?

    Let me know if you want more benchmarks in the meantime. I will update when complete. My 3d Mark 11 GPU scores are 9661 and 12836 for x1.1 and x4.1 respectively.

  11. So I got the R9 290 to run in x4 mode with the following procedure:

    PERST and CLCKRUN delays all set to 0.

    Boot with mHDMI Connected to port 1 into setup 1.30

    Change link width on port 1 to be x4.1

    Connect Ports 2, 3, and 4 mHDMI cables

    Send hot reset to port 1 to register the x4.1 link

    Perform pci compaction, assigning egpu to 36bit space

    Chainload into Windows

    Unfortunately, this is not what I was going for. I was hoping that it would simply be a push of the laptop power button as it is with the GTX 460. I really am interested in trying the GTX 970, but don't know about concurrent driver support for my Quadro FX 1700m.

    If I have the other mHDMI cables connected at boot, the R9 290 spins up to maximum fan speed (very loud) and eludes detection even after setting the x4.1 link and rebooting. The GTX 460 on the other hand does not exhibit this behavior. Any idea why the Radeon is doing what it's doing?

    In both cases, if all four mHDMI cables are left plugged in, the eGPU will not be detected on the first try. For the Nvidia GTX460, it simply requires setting the x4.1 link speed and rebooting. As mentioned above, this procedure does not work for the R9 290.

  12. I have a question then. All of my PM3Ns are version 1.2, which has the CLCKRUN delay switch. What event is this delay relative to? From looking at the PCI Express pinout, each lane after the first has a refclock in addition to two tx/rx differential pairs. So I guess CLCKRUN really is needed. The question is, how do I make everything start in sync?

    - - - Updated - - -

    Actually, never mind. It seems that http://pinouts.ru/Slots/pci_express_pinout.shtml gives a better explanation. I believe that small portion after the key is still part of the first lane, and the other lanes do not rely on their own independent CLCKRUN signal. Thus, it seems it would be better for me to set the CLCKRUN delay on ports 2 and 4 such that an x4.1 link is set before they have a chance to mess up detection. Will try after I'm done benchmarking at x1 speeds.

  13. Nando,

    I actually went and bought a Diamond R9 290 today. Testing it out now. I also uploaded some clearer pics to the dropbox folder.

    So far, I can only get the R9 290 to detect and work on an x1 link using port 1. When I connect all four cables, I don't have any hangup issues during POST, but the fans spin to maximum speed and the card is not detected. I suspect all this has something to do with the PM3N's CLCKRUN delay switches. I will try switching ports 2 and 3 to have the maximum delay in the hopes of setting an x4.1 link before the CLCKRUN signal conflicts. An alternative may be to tape the offending pins, although that is something I would rather avoid.

    The x1.1 R9 290 results look very promising. About twice the performance of the x4.1 GTX 460 so far.

    I am using the same eGPU hardware I was using with the Latitude E6500. I managed to fry that mobo with the dGPU somehow, so I bought a cheap mobo with the Intel X4500MHD iGPU so that I can use my spare PE4L for an x1.1opt link. It should be a good backup machine for family / guests (I have all the adapters and will use an XBOX 360 PSU).

    • Thumbs Up 1

  14. So I've been working on this egpu build off and on as a hobby, and I'm getting close to finishing. I have two Dell M4400's and a Dell E6500. This thread will focus on my quest to make a clean x4.1 PE4H setup with one of the Dell M4400s. In addition, I hope to perform some electrical wizardry to create a switchable micro HDMI input for the internal LCD.



    Notebook: 15" Dell Precision M4400

    CPU: Intel Core 2 Quad QX9300 @ 3.06 GHz

    RAM: 8.0 GB DDR2 800 MHz

    Internal LCD: 1920x1200 WUXGA 2CCFL

    dGPU: Nvidia Quadro FX1700m

    BIOS: Revision A19

    OS: Windows 7 Professional x64

    eGPU gear

    PE4H 2.4a

    1x Express Card Adapter with 60cm flat mHDMI cable

    3x PM3N

    3x Cable Matters mHDMI to HDMI female adapters

    3x Cable Matters mHDMI to HDMI cables (3 feet long each)

    3x P22S-P22F mPCIe extenders

    1x Corsair CX430 PSU

    eGPUs (on hand): Galaxy Nvidia GTX460 768MB, Sapphire X1950 Pro

    eGPU (to purchase): Diamond Radeon R9 290 or GTX 970

    External Monitor: Casio XJ 245v Laser-LED Projector

    eGPU Setup 1.30 software

    The port layout of the Dell M4400 (and Dell E6500) is as follows

    Port 1: WWAN mPCIe

    Port 2: WLAN mPCIe

    Port 3: Bluetooth mPCIe

    Port 4: Express Card

    This port layout is great for eGPUs and means we can attempt an x4.1 link. However, these systems have TOLUD = 3.5GB and require a DSDT override for the eGPU to coexist with the dGPU (there is no iGPU). I have performed such an override on the computer (my DSDT syntax was the same as reported by avlan), extending the root bridge to the 12.25GB + 4GB endpoint (0x417FFFFFF). Using the 12.25GB endpoint in setup 1.30 works just fine for the memory allocation.
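    To double-check the override arithmetic (my own math on the numbers quoted above, not anything setup 1.30 reports):

    ```python
    # Checking the DSDT override figures: the extended root bridge window
    # opens at the 12.25 GB endpoint entered in setup 1.30 and its last
    # byte is 0x417FFFFFF.
    GB = 1 << 30
    start = int(12.25 * GB)   # endpoint entered in setup 1.30
    last = 0x417FFFFFF        # last byte of the extended window

    print(hex(start))                 # 0x310000000
    print((last + 1) / GB)            # 16.375, the GB boundary it ends on
    print((last + 1 - start) / GB)    # window size in GB
    ```

    So the window actually spans 4.125 GB: the 4 GB block hoisted above 12.25 GB, plus 128 MB of headroom.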

    Installation

    So far, I have managed to get the GTX 460 working at an x4.1 link. The main stumbling point was the pci compaction, which I will list below in case someone runs into the same issue. These steps assume you have already performed the DSDT override above.

    1. Connect 3 PM3Ns and 1 Express Card to PE4H. Plug in eGPU and switch PSU on.

    2. Boot computer normally and load Setup 1.30 (Only from disk image, USB stick install freezes after compaction currently)

    3. The eGPU won't be detected on the first boot, so set x4.1 link on port 1 and reboot using the menu

    4. After the reboot, go into setup 1.30 again, the egpu should be visible and at an x4.1 link. If not, a cable may be faulty or not connected well.

    5. Perform PCI compaction with the following options:

    Scope: All devices Endpoint: 12.25 GB Options: Close unused bridges, force32 dGPU

    6. Chainload to OS, should work.

    Once in Windows, go ahead and install the Nvidia GPU drivers. I used the 337.88 GeForce drivers for both GPUs. Keep in mind that with two Nvidia GPUs, using different driver versions is a no-go and will lead to instability issues. Since Nvidia dropped support for DX10 GPUs such as my FX1700m (GeForce 9700m GT) as of driver revision 340.52, upgrading to a GTX 970 would prove troublesome, as it requires driver revision 344. If anyone has experience running a GTX 970 with an Nvidia DX10 dGPU, let me know.

    The above covers the software setup, at least for the GTX 460. I was not able to achieve x4.1 detection on the x1950 Pro (it did work at x1.1 using only one cable). I suspect that I will need to play with the delay switches on the port 2 and 3 PM3Ns, as their CLKRUN signals may be interfering with port 1. Nando has informed me that only 4 TX / RX pins are used on ports 2 - 4 to make the 2nd, 3rd, and 4th lanes. Thus, if I can delay the CLKRUN signals on the port 2 and 3 PM3Ns, I may be able to set an x4.1 link on port 1 before the redundant CLKRUN signal is started. For those of you who have experience with this, please let me know if you have further suggestions. Will be trying this when I get more time.

    On to the physical setup. I experimented with many different adapter combinations, but I found the Cable Matters cables to be the absolute best. Everything else eventually gave me errors or resulted in slightly less bandwidth (indicating excessive retransmission). I'll cut it short here, as this is where the majority of my time was spent. Each cable combination was tested using a CUDA based PCI Express bandwidth checker and the Crysis 2 benchmark tools. Unstable links simply ended up blue screening before passing either of those tests. Port 1 is the most important port, and faulty links sometimes went unnoticed on other ports. I eliminated those by testing in x1 mode on each port independently; only when the setup could pass that did I try an x4.1 link again. I have been using the cable setup mentioned above for a little while now, and it has passed numerous benchmarks flawlessly.

    I wanted to make this a clean, internal setup, so I went with the eGPU caddy approach. Essentially, I used the P22S-P22F extenders and removed my laptop DVD drive. I then removed the optical drive's guts so I could use the space for the P22S-P22Fs. By happy coincidence, the width available in the optical drive slot was almost exactly the width of 3 P22S boards. To each of the circuit boards, I attached a mHDMI to HDMI female adapter, making the whole setup easily pluggable without removing the bottom cover. Internally, each P22S-P22F board was connected to its mPCIe slot by way of an FFC ribbon cable provided in the kit (you can also buy these on Digikey or a similar electronics shop).

    To make everything fit and look nice, I cut a hole in the bottom of the optical drive casing with a Dremel to enable easy connecting / disconnecting of the cables. In addition, I designed a solid model of a caddy face, which I 3D printed at my university's 3D printing lab. I then screwed everything together, and the result is a self-contained eGPU connector optical drive module. It also has enough space for the circuitry I plan to include that would enable external input into the internal LCD.

    I have included pictures of the current setup in the following dropbox folder: [URL]https://www.dropbox.com/sh/r8j8scgafditgzh/AABggihY0gK2g7-HDFmDBwAOa?dl=0[/URL]

    Everything's a bit of a mess right now as I square everything away. But my plan is to make a laser-cut self-contained enclosure that will also double as a laptop stand. I am also looking into using some of the CNC equipment around here to mill my own copper heatsink / fan assembly with quick disconnect water cooling connections for dockable water cooling (still doing research on the fittings, but it seems feasible using a cheap all-in-one CPU cooler and a concept similar to this link: [URL]https://www.youtube.com/watch?v=uhw1n1N0hD0[/URL]). Once I am done with the enclosure, it should have the same planform area as the laptop and make all cables / power adapters stowable for easy portability in a backpack. The idea is to be able to pull this system out, plug one power cord into the wall, and connect the laptop, using the internal LCD, for a portable, powerful mobile workstation.

    I'm not quite done with a full scaling performance assessment, but the numbers look good so far; here is a preview that I will update as I get further along. I'm particularly interested in trying out the R9 290, as it has good performance at an x4.1 link. The best eGPU would be a GTX 970, but I think that is out due to the aforementioned conflicting Nvidia driver version issue. I will continue to update this thread (look for another update involving the E6500 as well).

    Performance

    TEST TYPE Galaxy GTX 460 768MB x4.1 PCIe Link Result
    PCIe Speed Test Computer to Card Bandwidth 616 MB/s
    PCIe Speed Test Card to Computer Bandwidth 810 MB/s
    PCIe Speed Test Bidirectional Bandwidth 720 MB/s
    3D Mark 06 13822 3D Marks
    3D Mark 06 SM 2.0 Score 5874
    3D Mark 06 SM 3.0 Score 5745
    3D Mark 06 CPU Score 4340
    RE5 DX9 Variable 98.0 FPS
    RE5 DX9 Fixed 49.8 FPS
    3D Mark Vantage P11340
    3D Mark Vantage Graphics 11132
    3D Mark Vantage CPU 12017
    RE5 DX10 Variable 98.2 FPS
    RE5 DX10 Fixed 48.9 FPS
    3d Mark 11 P3144
    3D Mark 11 Graphics 3031
    3D Mark 11 Physics 3956
    Uniengine Heaven 4.0 907
    Uniengine Heaven 4.0 Avg FPS 36.0 FPS
    Uniengine Heaven 4.0 Max FPS 81.2 FPS
    Uniengine Heaven 4.0 Min FPS 16.5 FPS
    Guild Wars FPS (Eye of North Outside) 381 FPS
    COD MW2 (Opening, looking towards trainees) 30.0 FPS
    DMC4 DX9 Scene 1 227 FPS
    DMC4 DX9 Scene 2 178 FPS
    DMC4 DX9 Scene 3 275 FPS
    DMC4 DX9 Scene 4 142 FPS
    Crysis 2 DX11 Xtreme Times Square Adrenaline 26.0 FPS
    • Thumbs Up 2
  15. I apologize in advance for the length of this post, but I have already invested a lot of time in getting this up and running, and I wanted to provide as much info as possible for troubleshooting.

    So I have posted about my setup before. I am in the process of achieving an x4.1 link on my Dell E6500. So far, I have achieved an x2.1 link that seems stable, minus a few other issues. I have been unable to achieve an x4.1 link thus far, as I don't have a reliable way to fit the bulky mHDMI connector in the half-height mPCIe slot previously occupied by the wifi card (on the bottom of the laptop, for reference). I have purchased a device I think will help me with this, located here. I will provide an update once I receive it. In the meantime, I was able to achieve an x2.1 link with two PM3Ns connected to the half-height WLAN slot and the former WWAN port, but it was not easily repeatable (loose half-height connection). In no case was I able to detect the eGPU when all four PCIe slots were occupied.

    For the stable egpu configurations at x1.1 and x2.1, I have been piecing together some performance information. The stable x2.1 connection involves one express card connector connected to root port 4 and one PM3N connected to root port 3 (where the bluetooth card used to be).

    To verify my pci express connections, I ran the following bandwidth checker I found on the evga forums.

    I have noted that certain games (COD MW2 and Orcs Must Die 2) seem to trigger a BSOD citing nvlddmkm.sys as a cause. This error occurs in x1.1 and x2.1 modes, regardless of whether my card is overclocked or not. My card passes a 15 minute Furmark test, and my CPU has been tested stable for over 10 hours in Orthos. I have also run a video memory tester, which returned no errors, and I am able to complete all 3D Mark testing as well. The only tests I had problems with were the DMC4 tests, which seemed to hang on my card (no BSOD, though).

    Here are the eGPU video connections: eGPU DVI 1: Connected to 23inch Viewsonic monitor eGPU mHDMI: connected to 5 port hdmi switch with a 50 foot hdmi cable out of the room, and the switch is connected to a Casio XJ-a245v projector. Sound output is enabled for this reason.

    Depending on whether I select the PC as an input at the switch, the egpu switches from single 1080p monitor to cloned 720p monitor mode. All testing was completed in single monitor mode, so I don't think the other monitor connection is causing my problems, but just stating that upfront.

    In addition, I am running a Corsair CX430 power supply connected to a Kill A Watt power meter. In normal, idle use, the eGPU seems to use around 50 watts. When 3D mark 06, RE5 var, or similar demanding 3D application is being used, the power consumption at the outlet jumps to 110 watts. When furmark is running, the power consumption jumps to a whopping 210 watts. My PSU is rated to handle these power demands with ease, though it could maybe be causing something. The fact that it doesn't crash even during extended furmark sessions tells me the PSU is not the problem here.
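    The Kill A Watt readings above include PSU conversion losses. As a rough sketch (the 80-85% efficiency range is my assumption for a CX430-class unit, not a measured figure), the DC power actually delivered works out to:

    ```python
    # Estimate DC-side power from wall readings, assuming a PSU efficiency
    # in the 80-85% range (assumption, not measured).

    def dc_watts(wall_watts, efficiency):
        return wall_watts * efficiency

    for label, wall in [("idle", 50), ("3D load", 110), ("Furmark", 210)]:
        lo, hi = dc_watts(wall, 0.80), dc_watts(wall, 0.85)
        print(f"{label}: {wall} W at the wall -> roughly {lo:.0f}-{hi:.0f} W DC")
    ```

    Even the Furmark worst case lands well under the CX430's 430 W rating, which supports my read that the PSU is not the problem.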

    I am currently running the driver verifier included in Windows 7 to ferret out any driver incompatibilities. For what it's worth, COD MW2 is much more stable than Orcs Must Die 2, but I would like to play both of those games.

    I have also included some pics of my setup. Here is one, another, another, and another.

    Without further ado, here are my current test results.
    DELL E6500 eGPU TESTING AT DIFFERENT PCIe Link Speeds
    OS: Windows 7 Professional x64
    CPU: Intel x9100 C0 Stepping 3.458 GHz at 1.285 Vcore
    CPU COOLING: DELL Precision M4400 Copper Heatpipe Cooler Installed in e6500
    RAM: 4.0GB DDR2 800 MHz, DSDT Override Enabled
    eGPU: Galaxy GTX460 768 MB
    eGPU Clocks: Core Clock 800 MHz, Memory Clock 1000 Mhz
    TEST TYPE                                x1.1 PCIe Link   x2.1 PCIe Link
    PCIe Speed Test: Computer-to-Card BW     163 MB/s         327 MB/s
    PCIe Speed Test: Card-to-Computer BW     197 MB/s         291 MB/s
    PCIe Speed Test: Bidirectional BW        177 MB/s         308 MB/s
    3DMark 06                                5447 3D Marks    9198 3D Marks
    3DMark 06 SM 2.0 Score                   2156             4000
    3DMark 06 SM 3.0 Score                   1992             3640
    3DMark 06 CPU Score                      3054             3042
    RE5 DX9 Variable                         75.0 FPS         88.0 FPS
    RE5 DX9 Fixed                            37.7 FPS         44.5 FPS
    3DMark Vantage                           P7283            P8911
    3DMark Vantage Graphics                  7574             10155
    3DMark Vantage CPU                       6531             6516
    RE5 DX10 Variable                        71.8 FPS         93.5 FPS
    RE5 DX10 Fixed                           40.4 FPS         48.5 FPS
    3DMark 11                                P2495            P2904
    3DMark 11 Graphics                       2393             2980
    3DMark 11 Physics                        2697             2695
    Unigine Heaven 4.0 Score                 390              594
    Unigine Heaven 4.0 Avg FPS               15.5 FPS         23.6 FPS
    Unigine Heaven 4.0 Max FPS               52.0 FPS         43.9 FPS
    Unigine Heaven 4.0 Min FPS               10.0 FPS         12.8 FPS
    Guild Wars                               87.0 FPS         131 FPS
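    As a sanity check on the PCIe numbers: a PCIe 1.x lane signals at 2.5 GT/s with 8b/10b encoding, so the raw payload ceiling is 250 MB/s per lane. A quick sketch (host-to-card figures from the table hard-coded) shows both links landing at about the same fraction of their ceiling, which suggests the overhead is per-lane protocol cost rather than something wrong with the x2 setup:

    ```python
    # Compare measured eGPU bandwidth against the PCIe 1.x theoretical ceiling.
    # 2.5 GT/s per lane with 8b/10b encoding gives 250 MB/s of raw payload
    # bandwidth per lane (before packet and protocol overhead).
    LANE_MBPS = 250  # PCIe 1.x per-lane payload ceiling

    measured = {  # host-to-card results from the table above, keyed by lane count
        1: 163,  # x1.1 link
        2: 327,  # x2.1 link
    }

    for lanes, mbps in measured.items():
        ceiling = lanes * LANE_MBPS
        print(f"x{lanes}: {mbps} MB/s of {ceiling} MB/s ceiling ({mbps / ceiling:.0%})")
    ```

    Both links come out around 65% of theoretical, which is consistent scaling.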
    • Thumbs Up 2
  16. So I wanted to provide an update to my configuration. For a couple of years, I have been running a GTX 460 over a PE4L v1.5 x1 ExpressCard link.

    I bought a PE4H and PM3N kit from M-Factors Storage, and I was able to achieve an x2 link on my Dell Latitude e6500.

    Specs:

    OS: Windows 7 Pro 64 bit

    CPU: Intel e8435 Core 2 Duo @ 3.06 GHz

    RAM: 4GB (required DSDT override due to TOLUD = 3.5 GB)

    eGPU: Galaxy GTX 460 768 MB; PE4H v2.4b with PM3N on Port 1, ExpressCard on Port 2

    dGPU: Nvidia Quadro NVS 160m

    The e6500's pci express layout is as follows (ICH9M Chipset):

    Port 1: WWAN

    Port 2: WLAN

    Port 3: Bluetooth

    Port 4: Express Card

    With the end goal of achieving an x2 link, I first tried replacing the Bluetooth module on port 3 with a PM3N card hooked to port 1 of the PE4H v2.4b. The ExpressCard was hooked to port 2 of the PE4H.

    When using Setup 1.x (version 10e8), the GPU was detected; however, enabling an x2 link on port 3 had the side effect of also setting port 1 to x2, causing problems with the WLAN card I had relocated to that port. In every instance, after doing the PCI compaction, the computer froze at the beginning of Windows loading. In addition, I found that I was unable to detect the eGPU after setting port 3 to x2. No matter what I did, I had to reboot from within the Setup 1.x menu to achieve detection.

    When I did reboot from within Setup 1.x after enabling an x2 link on port 3, something interesting happened. Ports 1 and 3 were still listed as x2 ports, but somehow ports 2 and 4 were also enabled at x1 link speed. I'm not sure why this happened.

    In any case, I simply disabled ports 2 and 4 from within Setup 1.x and proceeded with 36-bit PCI compaction. I forced the NVS 160m dGPU to be allocated in 32-bit space (there is no iGPU on this laptop), and the GTX 460 was allocated in 36-bit space (I had already performed a DSDT override).

    After all of this was done, chainloading into Windows 7 produced the desired result: my eGPU confirmed working at x2 1.0, and my dGPU working as well. I noticed a significant performance increase. This is great.

    I have two problems with things as they are now.

    1.) I'm not able to run my mPCIe WLAN card in port 1. Can I not run an x1 device on one lane of the port 1 x2 connection? Or have I made an error with the compaction?

    2.) Having to reboot from within the Setup 1.x menu for my eGPU to be detected after setting the x2 links is troubling. As a result, I am unable to create a startup.bat file that would make this plug and play.

    I eventually want to move to an x4 1.0 link for my eGPU. That will have to wait until I obtain two more PM3Ns. Unfortunately, M-Factors Storage would not sell me the PM3Ns directly, so I have to wait for overseas shipping from HWTools. I have already purchased a USB wifi adapter that seems to work well, but I was wondering if I could enable my internal wifi adapter in the interim (a much less clunky solution, as I move my laptop a lot).

    I currently have the cables routed in such a way that I can plug and unplug the PM3N almost as easily as the ExpressCard. I think I have an unused PCMCIA slot on my laptop that I would consider turning into a three-connector micro-HDMI hub (one connector for each PM3N). That would make the solution truly plug and play. I will be testing the effectiveness of the micro-HDMI connectors tomorrow when I receive those cables.

    That's all for now. Any suggestions?

  17. Hi all,

    I've been happily using my eGPU setup for the past two years. When I first set it up, I was unable to use 4GB of RAM due to the Dell TOLUD issue. I tried a DSDT override, but at first it merely broke my system (I was on Windows 7 32-bit at the time), and I eventually left it alone. Recently, I have been contemplating some upgrades, so I started by installing Windows 7 Pro x64 from scratch and going back to 4GB of RAM. After installing all drivers (including those for the built-in Quadro NVS 160m), I implemented the DSDT override using the instructions found on this board (mimicking the Dell M6500). After connecting the eGPU, Windows automatically allocated it in the 36-bit space, and I was able to install the drivers through the standard GeForce setup tool. After rebooting, everything worked, including the NVS 160m dGPU, and I didn't end up requiring any pre-boot modification.
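    To make the TOLUD arithmetic concrete, here's a rough sketch of why the 32-bit install couldn't see all 4GB, using the 3.5 GB TOLUD value reported on this e6500:

    ```python
    # With TOLUD at 3.5 GB, only 3.5 GB of a 4 GB kit sits below the 4 GB
    # boundary; the remainder is remapped above 4 GB, which a 32-bit OS
    # (and a root bridge without a >4 GB window) cannot reach.
    GB = 1024**3
    installed = 4 * GB
    tolud = int(3.5 * GB)  # Top Of Low Usable DRAM on this machine

    usable_32bit = min(installed, tolud)
    remapped_64bit = installed - usable_32bit
    print(f"Usable below 4 GB: {usable_32bit / GB:.1f} GB")
    print(f"Remapped above 4 GB (needs x64): {remapped_64bit / GB:.1f} GB")
    ```

    That missing half-gigabyte is exactly what the x64 install plus DSDT override recovered.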

    System Specifications:

    MODEL: DELL Latitude e6500

    OS: Windows 7 Professional 64 bit

    CPU: Intel e8435 @ 3.06 GHz

    RAM: 4GB DDR2-800

    Internal GPU: Quadro NVS 160m

    eGPU Connection: PE4L 1.5 with express card

    eGPU PSU: Corsair CX430

    eGPU GPU: Nvidia GTX 460 768MB

    Here is the link to my original setup at the old thread:

    http://web.archive.org/web/20120713075400/http://forum.notebookreview.com/gaming-software-graphics-cards/418851-diy-egpu-experiences-847.html#post8324201

    Regarding the upgrades I would like to implement, I have examined the pci-express layout on my computer, and I think I can achieve an x4 1.0 link.

    http://www.laptopschematic.com/wp-content/uploads/2010/11/Dell-Latitude-E6500-UMA-Block-Diagram.png

    Here are the port assignments:

    Port 1: WWAN (not currently installed)

    Port 2: WLAN (wifi, currently installed)

    Port 3: Bluetooth (currently installed)

    Port 4: Express Card (Currently eGPU)

    I think I can get rid of wifi and Bluetooth by replacing them with USB dongles, leaving the door open for an x4 link.
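    For a rough sense of what x4 could buy, here's a back-of-envelope projection. It assumes bandwidth scales linearly with lane count from the x1 host-to-card figure measured elsewhere in this thread (163 MB/s); that is optimistic, since real links tend to lose a little efficiency as lanes are added:

    ```python
    # Hypothetical projection only: linear scaling from the measured x1.1
    # host-to-card bandwidth. Treat this as an upper bound, not a promise.
    X1_MEASURED_MBPS = 163  # measured host-to-card bandwidth, x1.1 link

    def projected_mbps(lanes):
        """Optimistic linear projection of host-to-card bandwidth."""
        return X1_MEASURED_MBPS * lanes

    print(f"Projected x2 1.0: ~{projected_mbps(2)} MB/s")
    print(f"Projected x4 1.0: ~{projected_mbps(4)} MB/s")
    ```

    The x2 projection (~326 MB/s) matches the 327 MB/s actually measured on the x2 link almost exactly, which makes me hopeful the x4 estimate isn't far off.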

    Here is what it looks like on the bottom side of my laptop, showing the two occupied mpcie ports as well as the empty WWAN port.

    https://www.dropbox.com/s/q8bz2vhhl2pqo0r/e6500.jpg?dl=0

    I would very much like to try an x4 link, as I am bandwidth-limited in many applications, but I don't have the necessary hardware. I think I can get a PE4H with a PM3N adapter from M-Factors Storage, but I would still need two more PM3Ns, and I haven't been able to find those from a US seller. If anyone has any extra/unwanted PE4Hs or PM3Ns and wants to arrange a sale, please let me know. I am located in the Atlanta, GA area if that helps. This community has been a great help to me so far.

    • Thumbs Up 1