
hishamkali


  1. As far as PSU power consumption goes, the HDPlex PSU is specced up to 400W peak, which I imagine it achieves using the large bank of capacitors attached to the PCB. I have a Kill A Watt power meter, and I used it to measure the power consumption. All power consumption is measured at the wall, and thus includes power for the AKiTio board and attached fan. I believe HDPlex advertises 94% efficiency, so I would expect the actual GPU power consumption to be about 90% of the results given below once the enclosure overhead is subtracted.

At 100% TDP in FurMark (worst-case scenario), the card passed the 15-minute burn-in test without issue. Power consumption was 200W. Peak temperature for the GPU was 83 degrees Celsius with a final fan speed of 70%. While running the Crysis 2 benchmark using the 'extreme' preset, the highest power consumption I saw was 178W, and then only briefly. I would say that 150W would be typical during gaming. Here is a picture of the setup on my desk:

As far as the case goes, I received my extra acrylic in the mail today. The idea would be for the laptop to sit on top of the acrylic case, which would have no air intakes on top of it, so from the laptop's perspective it would be identical to sitting on the desk. There would be a 120mm intake on the bottom of the acrylic case, which I could suspend above the desk using the rubber feet from the AKiTio enclosure. Similarly, I could install the PSU upside down so that air would be drawn in from the underside instead of fighting the laptop for air.

The laptop CPU gets pretty hot if I run Prime95 on all 8 threads, quickly throttling down to about 2.7GHz at 90 degrees Celsius; the 4-core turbo speed is 3.5GHz. I set the turbo time limit to basically infinity (max number of seconds), and I also raised the CPU current limit to 100A and the TDP from 58W to 63W. I will try a repaste with some MX-4 I have handy while I have the computer apart to install my new IPS screen.
Maybe I can also drive the extra laptop LCD externally to create a mobile dual-monitor setup when I need it...
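As a rough sanity check on those wall readings (a sketch, not part of the original measurements): multiplying the wall figure by the advertised 94% efficiency gives the DC power the PSU actually delivers to the card and the AKiTio board.

```python
def dc_power_from_wall(wall_watts: float, psu_efficiency: float = 0.94) -> float:
    """Estimate DC power delivered by the PSU from a wall-meter reading."""
    return wall_watts * psu_efficiency

# 200 W at the wall during the FurMark run, at the advertised 94% efficiency:
dc = dc_power_from_wall(200.0)
print(f"{dc:.0f} W delivered")  # → 188 W delivered
```

Subtracting a few watts for the AKiTio board and its fan lands close to the "about 90% of the results" rule of thumb above.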
  2. I hope the information provided here helps others who want to try a Windows-based Thunderbolt eGPU.

Cost breakdown:
HP Zbook 15 G2: $1600 from PCNation or B&H Photo Video
EVGA GTX 970: $350
AKiTio Thunder2: $229
HDPLEX 250W: $85
Dell PA-9E 240W: $60 on eBay (I had one from my previous laptop)
HP 230W adapter (also works, slightly cheaper): $25-$30 on eBay
Acrylic: $25
Brackets, screws: $15
eGPU total (GPU + adapter + enclosure + PSU): $763
Laptop + 32GB DDR3: $1900
Total system cost: $2663

Overall not bad for a highly capable, expandable mobile workstation. I use this thing for my work, and I hadn't really had a new computer for 6 years, so it made sense to get something that just worked and had a good warranty.

I ran a bunch of benchmarks as well. Results are included in the table below. The full spreadsheet containing all of my eGPU benchmarks across all systems to date can be found at the following link: [URL="https://www.dropbox.com/s/bt9igxw7uy9yiql/Comprehensive_Testing.xlsx?dl=0"]eGPU Testing Excel Sheet[/URL]

Benchmark configuration:
OS: Windows 8.1 Professional x64
CPU: Intel i7 4710MQ @ 2.5 GHz
CPU cooling: HP Zbook 15 G2 stock cooling
RAM: 32.0GB DDR3 1600 MHz
eGPU: EVGA GTX 970 4.0GB ACX 2.0
eGPU clocks: core 1165 MHz, memory 1753 MHz, boost 1317 MHz

Benchmark results (x4.2 internal screen / x4.2 external screen / percent difference, internal vs. external):
PCIe Speed Test computer-to-card bandwidth, MB/s: 1266 / 1263 / 0.24%
PCIe Speed Test card-to-computer bandwidth, MB/s: 1373 / 1369 / 0.29%
PCIe Speed Test bidirectional bandwidth, MB/s: 1845 / 1845 / 0.00%
3DMark06 3D Marks: 25497 / 28052 / -9.11%
3DMark06 SM 2.0 score: 10378 / 11460 / -9.44%
3DMark06 SM 3.0 score: 11656 / 13296 / -12.33%
3DMark06 CPU score: 7177 / 7335 / -2.15%
RE5 DX9 1280x800 variable FPS: 186.1 / 237.5 / -21.64%
RE5 DX9 1280x800 fixed FPS: 128.2 / 154 / -16.75%
3DMark Vantage score: 29383 / 32228 / -8.83%
3DMark Vantage graphics: 31987 / 36695 / -12.83%
3DMark Vantage CPU: 23616 / 23606 / 0.04%
RE5 1280x800 DX10 variable FPS: 187.6 / 230.6 / -18.65%
RE5 1280x800 DX10 fixed FPS: 122.4 / 136.1 / -10.07%
3DMark 11 score: 10313 / 10775 / -4.29%
3DMark 11 graphics: 12007 / 12507 / -4.00%
3DMark 11 physics: 7490 / 7620 / -1.71%
Unigine Heaven 4.0 extreme preset 720p score: 1796 / 1890 / -4.97%
Unigine Heaven 4.0 extreme preset 720p avg FPS: 71.3 / 75 / -4.93%
Unigine Heaven 4.0 extreme preset 720p max FPS: 153.6 / 164.9 / -6.85%
Unigine Heaven 4.0 extreme preset 720p min FPS: 25.3 / 25.8 / -1.94%
Guild Wars FPS (Eye of the North outside): 331 / 650 / -49.08%
COD MW2 (opening, looking towards trainees): 61 / 75 / -18.67%
DMC4 DX9 scene 1 FPS: 275.81 / 440.89 / -37.44%
DMC4 DX9 scene 2 FPS: 261.63 / 368.94 / -29.09%
DMC4 DX9 scene 3 FPS: 261 / 346.5 / -24.68%
DMC4 DX9 scene 4 FPS: 200.98 / 235.17 / -14.54%
Crysis 2 Adrenaline benchmark Times Square DX11 FPS: 78.2 / 88.2 / -11.34%
COD MW2 (opening, looking opposite trainees): 165 / 460 / -64.13%
3DMark 13 Fire Strike: 7649 / 8407 / -9.02%
3DMark 13 Fire Strike graphics: 8525 / 9749 / -12.56%
3DMark 13 Fire Strike physics: 9045 / 9253 / -2.25%
3DMark 13 Sky Diver: 18043 / 21088 / -14.44%
3DMark 13 Sky Diver graphics: 23605 / 30277 / -22.04%
3DMark 13 Sky Diver physics: 8408 / 8662 / -2.93%
3DMark 13 Cloud Gate: 17236 / 20565 / -16.19%
3DMark 13 Cloud Gate graphics: 33176 / 53386 / -37.86%
3DMark 13 Cloud Gate physics: 6428 / 6525 / -1.49%
3DMark 13 Ice Storm: 39087 / 126811 / -69.18%
3DMark 13 Ice Storm graphics: 38479 / 265629 / -85.51%
3DMark 13 Ice Storm physics: 41376 / 44824 / -7.69%

3DMark benchmark links:
3dmark-FS.GPU=9749: [url]http://www.3dmark.com/fs/4052075[/url]
3dmark11.GPU=12507: [url]http://www.3dmark.com/3dm11/9424206[/url]
3dmark06=28052: [url]http://www.3dmark.com/3dm06/17731211[/url]
RE5_DX9_1280x800 = 237.5
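For reference, the percent-difference column above appears to follow the convention (internal − external) / external; a quick sketch reproducing two of the rows:

```python
def pct_diff(internal: float, external: float) -> float:
    """Percent difference of the internal-screen result relative to the external one."""
    return (internal - external) / external * 100

# Reproducing two rows of the table:
print(round(pct_diff(25497, 28052), 2))  # 3DMark06 3D Marks → -9.11
print(round(pct_diff(1266, 1263), 2))    # PCIe computer-to-card → 0.24
```

Negative values therefore mean the internal screen (which carries Optimus's return traffic over the same Thunderbolt link) is slower than the external screen.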
  3. So I have made a good bit of progress in the last few weeks. I believe I have ironed out most of the kinks and have a reliable setup. I have also made some upgrades and designed and constructed a Mark 1 version of an enclosure.

Current laptop specs:
HP Zbook 15 G2
OS: Windows 8.1 Professional x64
Screen: 15-inch 1080p eDP internal screen (upgrading to IPS 1080p later this week)
CPU: Intel i7 4710MQ (overclocked +200MHz to 3.7GHz single-core turbo using Intel XTU)
RAM: 32.0 GB DDR3 1600MHz
HDD: 256GB Crucial M500 SSD + 1.0 TB 7200rpm HDD in optical bay
iGPU: Intel HD 4600
dGPU: AMD FirePro M5100 2.0GB GDDR5

eGPU hardware:
GPU: EVGA GTX 970 SC ACX 2.0 4.0GB GDDR5
Adapter: AKiTio Thunder2 PCIe expansion box + molex-to-barrel adapter
Connection: 16Gbps x4.2 over Thunderbolt 2
Power supply: HDPLEX 250W Pico PSU + Dell PA-9E 240W laptop power adapter
eGPU enclosure: custom-built laser-cut acrylic, first pass.

Molex to barrel adapter: Preliminary hardware setup:

Software setup: I experimented a lot to get a working setup. As mentioned earlier, Windows 7 gave code 12, which to my knowledge has not been resolved as of yet. I made the jump to Windows 8.1, and everything seems to work plug and play, with the exception of Nvidia Optimus. Windows 8.1 is actually very nice with 'Start is Back' installed. Liking it so far. Starting from a clean installation of Windows 8.1 Pro, I first installed all HP drivers with no eGPU connected. The only exception was the HP-provided driver for the M5100 and Intel HD 4600; Optimus doesn't work if you use those drivers. Instead, I first installed the standard Intel HD 4600 drivers from the Intel website. Then I used the standard Windows 8.1 drivers for the M5100, resulting in it being detected as an 'R9 270M'. I made sure not to install the Catalyst Control suite, as that seemed to interfere with Optimus. Afterwards, I disabled my dGPU in Device Manager by disabling the PCI bridge above it.
This system topology can be accessed in Device Manager by selecting 'view devices by connection'. The specific device was 'Intel® Xeon® processor E3-1200 v3/4th Gen Core processor PCI Express x16 Controller - 0C01'. Once I had this device disabled, I rebooted the computer, completely removing the AMD FirePro M5100 from the PCI bus in Windows. Afterwards, I grabbed the latest NVIDIA drivers from their website, turned on my eGPU, and plugged the Thunderbolt cable in. After the device was initialized by Windows, I installed the NVIDIA drivers using the built-in setup. After a reboot, everything seemed to be working, and as an added bonus, both Optimus and PhysX were working! Previously, the same card on a Dell M6500 with a built-in FirePro M7820 had PhysX disabled due to NVIDIA locking out the functionality if an AMD GPU was present. No such issue with this computer, owing to the built-in Intel iGPU.

Optimus working: When the eGPU is disconnected and I take the computer to my office, I just re-enable the PCI Express controller, and I am able to use my built-in GPU. Ideally, I'd like for it to be detected as an M5100, as the FirePro drivers would be useful for CAD, but one can't have everything. The eGPU still works if the dGPU is active, but I can only accelerate the external screen, as Optimus will not activate with the AMD dGPU on the PCI bus. I don't really need to use the internal screen all that often, as I have a 23-inch monitor, but it is useful for traveling, which I do often.

That brings me to my next point. With everything working, I set out to make a good-looking, functional enclosure that would provide adequate ventilation while being as compact as physically possible. I have access to a laser cutter, and I purchased some acrylic to get started. I also wanted to make as much use of the provided AKiTio enclosure as possible, as that seemed to provide the best protection for the PCB.
The problem is that it is too short for a standard GTX 970, which is remedied with a hacksaw. The AKiTio enclosure (or what remains of it) is attached to the acrylic by screwing M3 screws through the bottom into the other side of the PCB standoffs. Build pics to follow.

Enclosure design for laser cutting: Laser cutting:

Cutting off the end of the AKiTiO enclosure with a hacksaw: This was not very easy and involved a lot of manual labor. My advice is to wear gloves and get extra hacksaw blades if necessary. First, cut off the small end piece connecting the back panel of the AKiTio enclosure. Then bend the back panel away and begin your cut. Being patient helps a lot. After some effort, victory is mine!

Preparing and installing the AKiTio PCB: Since I was already voiding my warranty, I decided to solder 12V power directly to the AKiTio PCB.

Threading L-brackets for attachment of a cover later: It just so happened that these corner braces already had 1/4-inch holes. I used a tap and drill to cut 20 threads per inch into the steel.

Assembly time: It's pretty cramped inside the case; closeup view of the HDPLEX PSU board included. I had to break my initial attempt at a front panel because the holes I put in for the TB cable were not big enough (the cable is thicker than the connector by a millimeter or so).

Preliminary testing without cover (thermal stress test): I set the eGPU up running Unigine Valley overnight to make sure it was stable. Seemed to check out. Max temps were similar to what I saw in open-air testing. Not too concerned here.

Adding a cover: I initially only cut a rectangular cover, but I ended up drilling holes for the mounting bracket screws and vent holes. This was pretty time consuming. For the vent holes, I simply drew a rough grid on the back using a Sharpie and drilled pilot holes through those points. I then went in from the other side with a larger drill bit for the actual holes. Drilling acrylic requires care to avoid cracking.
I think if I do this again, I will simply include the holes in the laser cutter file, as that is really the only way to get them perfect without chipping. Pictured next to the laptop for size. All hooked up: Optimus active:

In the future I'd like to build a better eGPU case, perhaps one with room for the laptop PSU inside of it and a hinge. The idea would be for the laptop to sit on top of the case. That would also allow me to put in a 120mm fan with a bottom intake, allowing for better airflow while the laptop is perched on top. Got some more acrylic coming in and will give it a go when I have some downtime.
  4. Hi all, I recently purchased an HP Zbook 15 G2 with the following specs:
OS: Windows 7 x64 and Windows 8.1 Pro x64
CPU: i7 4710MQ
RAM: 8.0 GB DDR3 (will swap out to 32GB DDR3)
iGPU: Intel HD 4600
dGPU: AMD FirePro M5100
Ports of interest: TB2, ExpressCard 54

eGPU hardware:
- EVGA GTX 970 SC
- AKiTio Thunder2 PCI Express expansion box
- Powered PCI Express x16 riser
- Corsair CX430 ATX PSU, powered on by SWEX
- Soldered molex-to-barrel adapter for the AKiTio expansion chassis

So I connected everything together, ensuring that my soldered molex-to-barrel adapter was wired correctly, and got a code 12 on Windows 7. I have not been able to overcome the code 12 issue in my limited testing. After being advised by Nando to switch to Windows 8.1, the eGPU worked perfectly via plug and play. It was not necessary to uninstall the dGPU drivers. The eGPU appears to be working properly in Windows 8.1 so far, giving me a score of about 28400 in 3DMark06. However, while running the Heaven 4.0 benchmark in extreme mode, I got some black screen disruptions. I think the eGPU was disconnecting itself, as I heard the Windows new-hardware ping when the benchmark resumed. Amazingly, it didn't crash, but there still must be some instability there. I will note that the powered riser is currently not connected to the PSU for lack of available molex connectors; I will try connecting it once I acquire a molex splitter. Also, the Heaven 4.0 score at 1080p Extreme was somewhat lower than what I have seen for this card running at 8Gbps x4.1 with a slower CPU (1066 vs 1200). The 1200 score was on Windows 7, though, and others have noted lower benchmark scores on Windows 8.1 in general. I have not been able to enable Nvidia Optimus yet for rendering on the internal display. So far, disabling the built-in dGPU results in code 43 on the Intel HD 4600. I may have to try completely uninstalling the AMD software / dGPU drivers and starting with iGPU drivers straight from Intel.
I tried running a CUDA bandwidth test and obtained the following numbers: HtoD: ~1250 MiB/s, DtoH: ~1350 MiB/s. The above would seem to indicate that I am achieving an x4.2 link; GPU-Z indicates that as well. I will be updating this thread as I have more time to test. I would like to make this an integrated box with an SFX PSU. Pictures of the setup can be found at the following Dropbox link until I can size them down somewhat:
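Separately, a rough way to sanity-check the link width from those CUDA numbers (a sketch using standard PCIe per-lane figures after 8b/10b encoding; not part of the original post):

```python
# Standard effective PCIe bandwidth per lane after 8b/10b encoding, in MB/s.
PER_LANE_MB_S = {1: 250, 2: 500}  # PCIe generation -> MB/s per lane

def smallest_link(measured_mb_s: float, gen: int, widths=(1, 2, 4, 8, 16)):
    """Smallest link width whose theoretical ceiling covers the measured rate."""
    for width in widths:
        if width * PER_LANE_MB_S[gen] >= measured_mb_s:
            return width
    return None

# ~1350 MB/s exceeds the x2 gen2 ceiling (1000 MB/s), so the link must be
# at least x4 gen2 -- consistent with the x4.2 link GPU-Z reports.
print(smallest_link(1350, gen=2))  # → 4
```

Real throughput sits well below the theoretical ceiling because of protocol overhead and, here, Thunderbolt tunneling, so this only bounds the width from below.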
  5. I found a link with possible pricing information on the W541. Not too cheap. Hoping that actual direct to consumer pricing will be a little lower. Search - PC Connection
  6. So the principal issue involved in me getting the W541 is the timing. I have no idea when it will be released, nor do my university IT folks who purchase through Lenovo. The reason I am in the market for a new computer is that my Dell M6500, which I had been using in x4 mode with the PE4H and GTX 970, had its motherboard burn out. It was performing on par with TB1 speeds, as verified by CUDA-Z, but was a mess of wires. I'm not sure what happened, but something must have gotten bumped in the wrong way while the computer was on. I thought it was just a bad connection, but proper and careful reconnection also burned out my backup motherboard. I'm typing this on my backup laptop, and a new motherboard will be here on Friday.... I've already ordered an AKiTio TB2 PCIe expansion box and powered riser cable. I have a Corsair CX430 I'll be using to power it all. So I'm more or less invested in buying a new laptop that supports Thunderbolt but is also oriented for Windows. I get a discount on the W540 through my university for a total of 32% off until tomorrow, resulting in a price of $950 for the starting config. I already have 32GB of DDR3, and the W540 has 4 DIMM slots to drop the RAM into. Also, it has an optical bay slot I can use for a second hard drive. Plus, it supports much faster CPUs than the M3800 (one option). I believe you can go up to an i7 4930MX for an additional $600 or so including the 32% discount, which, if there is no throttling, is fantastic due to the unlocked multiplier and ThrottleStop. I also want to run fully coupled numerical simulations for work on this computer when I want, so that aspect is important to me. In addition, another person in my lab just received a W540 as their main computer, so I may be trying it out with theirs once I receive my AKiTio box in a day or so. The M3800, on the other hand, has full support for TB2 instead of TB1 and is available right now.
It has the advantages of a touch screen, a much lighter/thinner profile, and aluminum instead of a full plastic body like the ThinkPad. Unfortunately, it does not have a removable battery, though it is configurable with up to a 91Whr battery. The 91Whr battery option apparently removes the option for a 2.5in HDD, so you would have to make do with the mSATA slot. No fancy custom docking station connector here, but there appears to be a nifty USB 3.0 docking station (though I'm not sure how that actually works with regards to video, etc.). The main disadvantages are the soldered-in, slightly slower processor and only 2 DIMM slots for a max of 16GB of RAM. My lab is in the process of buying me a pretty powerful workstation for the office, so the RAM issue may be moot. The M3800 configuration I found was cheapest by swapping Windows with Linux for a savings of $100 (I get Windows through my university), plus a 5% off coupon I found on the internet for Precisions. Final price of $1450 (8GB of RAM, 500GB HDD). The W541 appears to combine the best of both worlds, but if it's not available soon (tried to email Lenovo sales, no response yet), I may have to pass on it for the M3800; it is so tempting to pull the trigger on an already available product. TL;DR: W540 vs M3800 = battle between workstation vs. ultrabook, TB1 vs. TB2, 32GB RAM vs. 16GB RAM, expandability vs. ultimate portability. I mainly care about TB2, followed by CPU, followed by portability, followed by RAM. Does anyone own one of these machines and can attest to its feasibility for eGPU use?
  7. Any idea whether this laptop would enable Nvidia Optimus on the internal display over Thunderbolt 2? I imagine you would have to disable the internal Nvidia Quadro by using Setup 1.30 or similar. I need a new workstation laptop and already have an EVGA GTX 970. Trying to decide between this and the (much cheaper) Lenovo ThinkPad W540, which is on sale now for less than $1000.
  8. So I am chugging away at finally getting a successful eGPU implementation that I am happy with. I currently have four laptops, including one Dell M6500. The E6500 and M4400s will be given to my family back home, as they need the computers, but in the meantime they have been useful testbeds.

First, the specifications of the M6500:
Internal display: 17" WUXGA
CPU: i7 920XM @ 2.00 GHz (I can take it way past 3.0 GHz, but leaving it stock for now to work on the eGPU)
Storage: SSD + 1.0TB WD 7200RPM HDD
dGPU: AMD FirePro M7820 (workstation equivalent of the M5870)
OS: Windows 7 Pro x64 / Windows 8.1 x64
Drivers: from the Dell website, plus the latest Catalyst driver suite from AMD

PCI Express port layout:
Port 1: WWAN port (empty)
Port 2: WLAN port (normally occupied by the Wi-Fi card, currently removed)
Port 3: Bluetooth port
Port 4: ExpressCard
Port 5: mSATA slot? (It's wired for PCI Express according to the motherboard schematic, but I was unable to detect anything plugged into that port.)
Port 6: Broadcom Ethernet

eGPUs under testing:
eGPU1: Galaxy GTX 460 768MB
eGPU2: EVGA GTX 970 ACX 2.0

eGPU hardware:
Adapter: PE4H v2.4b with 3 PM3Ns and one EC adapter for up to 8Gbps bandwidth (x4.1 link)
PSU: Corsair CX430
HDMI cables: Cable Matters 3-foot mHDMI-to-HDMI cable + Cable Matters mHDMI male to HDMI female adapter
Enclosure: laser-cut particle board enclosure; will fully assemble once the eGPU is verified working

PM3N enclosure:

Due to the high TOLUD = 3.5 GB, I had to do a DSDT override to allow for the eGPU. I followed some of the guides available and used syntax similar to Avlan's DSDT override; I actually modified his code so that I would achieve a 56.25GB endpoint like everyone else. With the override in place, I put the laptop in standby and connected the GTX 970 via ExpressCard only. Upon resume, Windows found new hardware. I canceled any driver installs and installed the latest drivers from Nvidia's website. It then prompted me to reboot. Upon rebooting, my issues began.
Number one, I noticed that the Dell M6500 does not like the GTX 970 being connected at bootup. If it sees the GTX 970 on bootup, it kicks the dGPU out of its PCI allocation into some weird quasi-disabled state (still on the PCI bus, though) and puts the GTX 970 in its place (or so I think; I haven't checked with a 32-bit memory map). The end result is that Windows 7 loads using the Vista-style loading screen (no nice colorful Windows logo). Once I boot into Windows, I am greeted with a 16-bit 640x480 screen courtesy of the code 12 error against my dGPU. This being Windows 7, I am unable to disable my dGPU and have the eGPU active at the same time. On the E6500 with its Intel 4500MHD iGPU, the card worked fine with no conflicts; however, PCI Express root port number three was rather flaky and would lead to spontaneous crashing. So I do know this video card works with at least some Dell systems and can indeed run at an x4.1 link. Also, I have noticed that when I boot into Setup 1.30 with the GTX 970 connected, the status bar at the top right doesn't show how much free space is available. It also doesn't show a PCI allocation status of yes* or no*. So something is being overridden by the eGPU; I don't know enough about the software to say what. I have to solve this compaction issue if I want an x4.1 link on Windows 7 with the GTX 970. Any ideas?
Here are some benchmarks I did for the GTX 970 at x1.1 on an ExpressCard link with the M6500.

TEST TYPE: x1.1 PCIe link result
PCIe Speed Test computer-to-card bandwidth, MB/s: 181
PCIe Speed Test card-to-computer bandwidth, MB/s: 199
PCIe Speed Test bidirectional bandwidth, MB/s: 331
3DMark06 3D Marks: 6511
3DMark06 SM 2.0 score: 2508
3DMark06 SM 3.0 score: 2309
3DMark06 CPU score: 4837
RE5 DX9 1280x800 variable FPS: 92.5
RE5 DX9 1280x800 fixed FPS: 42.5
3DMark Vantage score: 23880
3DMark Vantage graphics: 28093
3DMark Vantage CPU: 16470
RE5 1280x800 DX10 variable FPS: 122.2
RE5 1280x800 DX10 fixed FPS: 55.4
3DMark 11 score: 8489
3DMark 11 graphics: 9781
3DMark 11 physics: 6212
Unigine Heaven 4.0 extreme preset 720p score: 1339
Unigine Heaven 4.0 extreme preset 720p avg FPS: 53.2
Unigine Heaven 4.0 extreme preset 720p max FPS: 128.8
Unigine Heaven 4.0 extreme preset 720p min FPS: 11.5
Guild Wars FPS (Eye of the North outside): 168
COD MW2 (opening, looking towards trainees): 13
DMC4 DX9 scene 1 FPS: 109.62
DMC4 DX9 scene 2 FPS: 80.887
DMC4 DX9 scene 3 FPS: 154.01
DMC4 DX9 scene 4 FPS: 59.1
Crysis 2 Adrenaline benchmark Times Square DX11 FPS: 58.2

- - - Updated - - -

In this next post I wanted to document my experiences with my trusty old GTX 460. It has been running perfectly now for over three years in various eGPU setups and implementations (it has worked with all four of my laptops). Maybe this GTX 460 incorporates some kind of PCI reset delay switch that makes it advantageous for eGPU detection / avoiding BIOS interference... Same system as above, still on the flash drive for this bit of testing. The keyboard backlight doesn't work, Ethernet is hit or miss, and there is a popping noise that sometimes comes from the speakers just before Windows loads. I really want all those auxiliary devices to work, as I still use it as a laptop for engineering work, so I purchased a new mainboard for pretty cheap. It would be useful in figuring out what exactly is failing...
I'm currently preparing a Windows 8.1 image so that I can try using the GTX 970 with the dGPU disabled. It's pretty easy to swap OSs with Macrium Reflect, so I'll be doing that until I have some permanent, tested, working solution (staying far, far away from my actual critical work OS install until then). Oh yeah, benchmarks for the GTX 460 on the Dell M6500:

TEST TYPE: x1.1 PCIe link result
PCIe Speed Test computer-to-card bandwidth, MB/s: 164
PCIe Speed Test card-to-computer bandwidth, MB/s: 198
PCIe Speed Test bidirectional bandwidth, MB/s: 180
3DMark06 3D Marks: 5321
3DMark06 SM 2.0 score: 2078
3DMark06 SM 3.0 score: 1976
3DMark06 CPU score: 2971
RE5 DX9 1280x800 variable FPS: 76.7
RE5 DX9 1280x800 fixed FPS: 38.8
3DMark Vantage score: 7531
3DMark Vantage graphics: 7003
3DMark Vantage CPU: 9729
RE5 1280x800 DX10 variable FPS: 69.4
RE5 1280x800 DX10 fixed FPS: 40.4
3DMark 11 score: 2437
3DMark 11 graphics: 2187
3DMark 11 physics: 6201
Unigine Heaven 4.0 extreme preset 720p score: 277
Unigine Heaven 4.0 extreme preset 720p avg FPS: 11
Unigine Heaven 4.0 extreme preset 720p max FPS: 32.8
Unigine Heaven 4.0 extreme preset 720p min FPS: 7.5
Guild Wars FPS (Eye of the North outside): 148
COD MW2 (opening, looking towards trainees): 10
DMC4 DX9 scene 1 FPS: 103.07
DMC4 DX9 scene 2 FPS: 76.52
DMC4 DX9 scene 3 FPS: 141.37
DMC4 DX9 scene 4 FPS: 50.86
Crysis 2 Adrenaline benchmark Times Square DX11 FPS: 17.1

- - - Updated - - -

Still trying to compact with my GTX 970 attached, but I believe I am gaining some insight into the problem. I really need help here, though, and I haven't found solutions by searching. First, even when compaction freezes and doesn't bring me back to the status window, Setup 1.30 saves pci.bat. I can boot into Setup 1.30, this time selecting the DOS prompt, and call pci.bat. What's interesting is that it... actually runs without issue. The issue seems to be that the C: drive (USB stick) has fallen off of its mount. I can still access E: (my Windows drive), but there seems to be no C: access.
I thought no problem, let me chainload using the MBR, since maybe the BIOS renumbered the HDDs. So I typed 'call chainload mbr noremap'. That didn't work, so I then tried 'call chainload bootmgr noremap'. Also didn't work. Tried 'call chainload bootmgr'. Also didn't work. This problem seems entirely specific to having the 970 attached. If I leave the 970 attached and powered on before boot, I don't see the Dell POST screen, but Setup 1.30 works fine. I can't do a compaction without freezing (once again, due to the USB drive being unaddressable / dismounted). I'm really not sure why this is happening. Booting up with the Galaxy GTX 460 causes no problems whatsoever. The machine POSTs with the Dell logo properly and then continues to Setup 1.30. There, I can do a compaction on 'dGPU eGPU', putting the eGPU into 36-bit space and forcing the dGPU to 32-bit. Booting into Windows 7 works. All of this from the same USB stick. I did some initial digging and found that the video card status display using nvflash works for the GTX 460, but not for the GTX 970. It simply says 'no Nvidia GPUs detected'. This error message is in spite of the fact that the eGPU is in fact detected and sitting on port 4 with the correct hardware ID. I followed the instructions for creating and loading a pcidump bin file to put the device on the PCI bus using RW-Everything, but it still doesn't show up in nvflash. I assume that this is because the version of nvflash included doesn't know about the GTX 970 series, since it's so new. To that end, I have run the compact fail diagnostic. I am attaching the diag.zip file to this post if anyone cares to take a look at it. If you have a Dell Core 2 Duo or first-gen i7 business
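Stepping back, the compaction problem comes down to simple address arithmetic: with TOLUD at 3.5 GB there is only 512 MiB of 32-bit MMIO space below 4 GB, so the large BARs of two GPUs cannot both fit and one must be relocated to 36-bit space. A toy illustration (the BAR sizes are assumptions for illustration, not values dumped from the M6500):

```python
MIB = 1 << 20

TOLUD = 3584 * MIB               # top of low usable DRAM: 3.5 GB
MMIO_32 = 4096 * MIB - TOLUD     # 32-bit MMIO window left for devices: 512 MiB

# Illustrative BAR demands (example sizes, not read from this machine):
dgpu_bars = 256 * MIB + 32 * MIB   # large aperture + register BARs
egpu_bars = 256 * MIB + 32 * MIB   # GTX 970: prefetchable aperture + registers

print(dgpu_bars + egpu_bars <= MMIO_32)  # → False: both GPUs can't fit below 4 GB
print(dgpu_bars <= MMIO_32)              # → True: one fits; the other goes to 36-bit space
```

This is exactly what the 'dGPU eGPU' compaction does: it leaves one card's BARs in the 32-bit window and pushes the other's above 4 GB, which is why the DSDT override extending the root bridge is a prerequisite.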
  9. So I just wanted to provide an update. I am currently testing an EVGA GTX 970 with my M4400. As I suspected, there is no way (that I have discovered) to install the r343 drivers for the GTX 970 as well as the r340 drivers for the Quadro FX 1700M (a DX10 GPU) at the same time. This problem is the single greatest hurdle to the eGPU setup with the GTX 970. I also have a Dell E6500 with an X9100 CPU and integrated Intel graphics, so I have been testing with that configuration. I plan to make this an x1.opt setup with the GTX 460, but in the meantime I have been using it for x1 and x4 testing of the GTX 970. However, I am having issues with one of the mPCIe ports / riser cables (port #3). It returns errors and generally offers less performance than all the other ports. I want to try this again with shorter ribbon cables later. I'm kind of wondering if I should swap the GTX 970 back for the R9 290. The GTX 970 with the Dell E6500 is great. It doesn't even require PCIe compaction (DSDT override), just port switching to enable x4.1. However, due to the aforementioned instability issue on port 3, it's still not working perfectly. The GTX 970 has superior noise and thermal characteristics as well as more advanced driver support, and it also doesn't require wonky hotplugging of mPCIe ports. I am also making progress on the enclosure. I have purchased a 4'x2' sheet of cheap particle board from which I will make a prototype enclosure. I have drawn up the cutout in a Google SketchUp file for use with the laser cutter at my school. The case is designed to have roughly the same planform area as the laptop, with the laptop sitting on top. I have updated my build log Dropbox link with additional pictures. The circuit board you see next to the GTX 460 on the desk is an MT6820-B that takes an external GPU input and converts it into LVDS (the same signal used by the laptop's internal display). Found at:
  10. I am in the middle of some benchmarks now. The benchmarked games are pretty much what you saw above. I have completed x1.1 testing with the R9 290 and am moving through x4.1 testing now. I got much smaller performance increases moving from x1.1 to x4.1 with the 290 as opposed to the GTX 460. I've been looking for a working utility to test the real throughput of the AMD card's PCIe interface, but to no avail; PCIeSpeedTest.exe doesn't work on my system for some reason. I am also seeing some relatively low DX9 performance (slower than the GTX 460 in some games). Not really sure what is going on here. Maybe the R9 290 doesn't think those games are quite worth its time and never comes out of downclocked mode? Let me know if you want more benchmarks in the meantime. I will update when complete. My 3DMark 11 GPU scores are 9661 and 12836 for x1.1 and x4.1 respectively.
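To put the link scaling in those two scores into one number (a quick sketch, not part of the original post):

```python
def link_scaling(x1_score: float, x4_score: float) -> float:
    """Fractional gain going from an x1.1 to an x4.1 link."""
    return x4_score / x1_score - 1

# 3DMark 11 GPU scores quoted above for the R9 290:
print(f"{link_scaling(9661, 12836):.0%}")  # → 33%
```

A roughly 33% gain from quadrupling the link width supports the observation that the 290 is less bandwidth-limited at x1.1 than the GTX 460 was.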
  11. So I got the R9 290 to run in x4 mode with the following procedure:
1. Set the PERST and CLCKRUN delays all to 0.
2. Boot with mHDMI connected to port 1 into Setup 1.30.
3. Change the link width on port 1 to x4.1.
4. Connect the mHDMI cables on ports 2, 3, and 4.
5. Send a hot reset to port 1 to register the x4.1 link.
6. Perform PCI compaction, assigning the eGPU to 36-bit space.
7. Chainload into Windows.

Unfortunately, this is not what I was going for. I was hoping that it would simply be a push of the laptop power button, as it is with the GTX 460. I really am interested in trying the GTX 970, but I don't know about concurrent driver support for my Quadro FX 1700M. If I have the other mHDMI cables connected at boot, the R9 290 spins up to maximum fan speed (very loud) and eludes detection even after setting the x4.1 link and rebooting. The GTX 460, on the other hand, does not exhibit this behavior. Any idea why the Radeon is doing what it's doing? In both cases, if all four mHDMI cables are left plugged in, the eGPU will not be detected on the first try. For the Nvidia GTX 460, it simply requires setting the x4.1 link speed and rebooting. As mentioned above, this procedure does not work for the R9 290.
  12. I have a question then. All of my PM3Ns are version 1.2, which has the CLCKRUN delay switch. What event is this delay relative to? From looking at the PCI Express pinout, each lane after the first has a refclock in addition to two TX/RX differential pairs. So I guess CLCKRUN is really needed. The question is, how do I make everything start in sync?

- - - Updated - - -

Actually, never mind. It seems that http://pinouts.ru/Slots/pci_express_pinout.shtml gives a better explanation. I believe that that small portion after the key is still part of the first lane; all other lanes do not rely on their own independent CLCKRUN signal. Thus, it seems it would be better for me to set the CLCKRUN delay on ports 2 and 4 such that an x4.1 link is set before they have a chance to mess up detection. Will try after I'm done benchmarking at x1 speeds.
13. Nando, I actually went and bought a Diamond R9 290 today and am testing it out now. I also uploaded some clearer pics to the Dropbox folder.

So far, I can only get the R9 290 to detect and work on an x1 link using port 1. When I connect all four cables, I don't have any hang-up issues during POST, but the fans spin to maximum speed and the card is not detected. I suspect all of this has something to do with the PM3Ns' CLKRUN delay switches. I will try switching ports 2 and 3 to the maximum delay in the hope of setting an x4.1 link before the CLKRUN signals conflict. An alternative may be to tape the offending pins, although that is something I would rather avoid.

The x1.1 R9 290 results look very promising: about twice the performance of the x4.1 GTX 460 so far.

I am using the same eGPU hardware I was using with the Latitude E6500. I managed to fry that mobo with the dGPU somehow, so I bought a cheap mobo with the Intel X4500MHD iGPU so that I can use my spare PE4L for an x1.1Opt link. It should be a good backup machine for family/guests (I have all the adapters and will use the Xbox 360 PSU).
14. So I've been working on this eGPU build off and on as a hobby, and I'm getting close to finishing. I have two Dell M4400s and a Dell E6500. This thread will focus on my quest to make a clean x4.1 PE4H setup with one of the M4400s. In addition, I hope to perform some electrical wizardry to create a switchable micro HDMI input for the internal LCD.

Notebook: 15" Dell Precision M4400
CPU: Intel Core 2 Quad QX9300 @ 3.06 GHz
RAM: 8.0 GB DDR2 800 MHz
Internal LCD: 1920x1200 WUXGA 2CCFL
dGPU: Nvidia Quadro FX 1700M
BIOS: Revision A19
OS: Windows 7 Professional x64

eGPU gear:
PE4H 2.4a
1x ExpressCard adapter with 60cm flat mHDMI cable
3x PM3N
3x Cable Matters mHDMI to HDMI female adapters
3x Cable Matters mHDMI to HDMI cables (3 feet long each)
eGPU Setup 1.30 software

The port layout of the Dell M4400 (and Dell E6500) is as follows:
Port 1: WWAN mPCIe
Port 2: WLAN mPCIe
Port 3: Bluetooth mPCIe
Port 4: ExpressCard

This port layout is great for eGPUs, and means we can attempt an x4.1 link. However, these systems have TOLUD = 3.5GB and require a DSDT override for the eGPU to coexist with the dGPU (there is no iGPU). I have performed such an override on this computer (my DSDT syntax was the same as reported by avlan), extending the root bridge to the 12.25GB + 4GB endpoint (0x417FFFFFF). Using the 12.25GB endpoint in Setup 1.30 works just fine for the memory allocation.

Installation
So far, I have managed to get the GTX 460 working at an x4.1 link. The main stumbling point was the PCI compaction, so I will list the steps below in case someone runs into the same issue. These steps assume you have already performed the DSDT override above.

1. Connect the 3 PM3Ns and 1 ExpressCard adapter to the PE4H. Plug in the eGPU and switch the PSU on.
2. Boot the computer normally and load Setup 1.30 (only from the disk image; the USB stick install currently freezes after compaction).
3. The eGPU won't be detected on the first boot, so set the x4.1 link on port 1 and reboot using the menu.
4. After the reboot, go into Setup 1.30 again; the eGPU should now be visible at an x4.1 link. If not, a cable may be faulty or not connected well.
5. Perform PCI compaction with the following options: Scope: All devices; Endpoint: 12.25 GB; Options: Close unused bridges, force32 dGPU.
6. Chainload to the OS; it should work.

Once in Windows, go ahead and install the Nvidia GPU drivers. I used the 337.88 GeForce drivers for both GPUs. Keep in mind that with two Nvidia GPUs, using different driver versions is a no-go and will lead to instability issues. Since Nvidia dropped support for DX10 GPUs such as my FX 1700M (a GeForce 9700M GT) as of driver revision 340.52, upgrading to a GTX 970 would prove troublesome, as it requires driver revision 344. If anyone has any experience running a GTX 970 alongside an Nvidia DX10 dGPU, let me know.

The above covers the software setup, at least for the GTX 460. I was not able to achieve x4.1 detection on the X1950 Pro (it did work at x1.1 using only one cable). I suspect I will need to play with the delay switches on the port 2 and 3 PM3Ns, as their CLKRUN signals may be interfering with port 1. Nando has informed me that only 4 TX/RX pins are used on ports 2-4 to make the 2nd, 3rd, and 4th lanes. Thus, if I can delay the CLKRUN signals on the port 2 and 3 PM3Ns, I may be able to set an x4.1 link on port 1 before the redundant CLKRUN signal is started. For those of you with experience here, please let me know if you have further suggestions. Will be trying this when I get more time.

On to the physical setup. I experimented with many different adapter combinations, but I found the Cable Matters cables to be the absolute best. Everything else eventually gave me errors or resulted in slightly less bandwidth (indicating excessive retransmission). I'll cut it short here, as this is where the majority of my time was spent.
Each cable combination was tested using a CUDA-based PCI Express bandwidth checker and the Crysis 2 benchmark tools. Unstable links simply ended up blue-screening before passing either of those tests. Port 1 is the most important port, and faulty links sometimes went unnoticed on the other ports; I eliminated those by testing in x1 mode on each port independently. Only when the setup could pass that did I try an x4.1 link again. I have been using the cable setup mentioned above for a little while now, and it has passed numerous benchmarks flawlessly.

I wanted to make this a clean, internal setup, so I went with the eGPU caddy approach. Essentially, I used the P22S-P22F extenders and removed my laptop's DVD drive. I then proceeded to remove the optical drive's guts so I could use the space for the P22S-P22Fs. By happy coincidence, the width available in the optical drive slot was almost exactly the width of 3 P22S boards. To each of the circuit boards, I attached a mHDMI to HDMI female adapter, making the whole setup easily pluggable without removing the bottom cover. Internally, each P22S-P22F board was connected to its mPCIe slot by way of an FFC ribbon cable provided in the kit (you can also buy these from Digi-Key or a similar electronics shop).

To make everything fit and look nice, I cut a hole in the bottom of the optical drive casing with a Dremel to enable easy connecting/disconnecting of the cables. In addition, I designed a solid model of the caddy face and 3D printed it at my university's 3D printing lab. I then screwed everything together, and the result is a self-contained eGPU connector optical drive module.
It also has enough space for the circuitry I plan to include to enable external input into the internal LCD.

I have included pictures of the current setup in the following Dropbox folder: [URL]https://www.dropbox.com/sh/r8j8scgafditgzh/AABggihY0gK2g7-HDFmDBwAOa?dl=0[/URL]

Everything's a bit of a mess right now as I square everything away, but my plan is to make a laser-cut, self-contained enclosure that will also double as a laptop stand. I am also looking into using some of the CNC equipment around here to mill my own copper heatsink/fan assembly with quick-disconnect water cooling connections for dockable water cooling (still doing research on the fittings, but it seems feasible using a cheap all-in-one CPU cooler and a concept similar to this: [URL]https://www.youtube.com/watch?v=uhw1n1N0hD0[/URL]). Once I am done with the enclosure, it should have the same planform area as the laptop and make all cables/power adapters stowable for easy portability in a backpack. The idea is to be able to pull this system out, plug one power cord into the wall, and connect the laptop, using the internal LCD, for a powerful, portable mobile workstation.

I'm not quite done with a full scaling performance assessment, but the numbers look good so far; here is a preview that I will update as I get further along. I'm particularly interested in trying out the R9 290, as it has good performance at an x4.1 link. The best eGPU would be a GTX 970, but I think that is out due to the aforementioned conflicting Nvidia driver version issue.
I will continue to update this thread as I get further along (look for another update involving the E6500 as well).

Performance

TEST TYPE                                       Galaxy GTX 460 768MB, x4.1 PCIe Link
PCIe Speed Test Computer to Card Bandwidth      616 MB/s
PCIe Speed Test Card to Computer Bandwidth      810 MB/s
PCIe Speed Test Bidirectional Bandwidth         720 MB/s
3D Mark 06                                      13822 3D Marks
3D Mark 06 SM 2.0 Score                         5874
3D Mark 06 SM 3.0 Score                         5745
3D Mark 06 CPU Score                            4340
RE5 DX9 Variable                                98.0 FPS
RE5 DX9 Fixed                                   49.8 FPS
3D Mark Vantage                                 P11340
3D Mark Vantage Graphics                        11132
3D Mark Vantage CPU                             12017
RE5 DX10 Variable                               98.2 FPS
RE5 DX10 Fixed                                  48.9 FPS
3D Mark 11                                      P3144
3D Mark 11 Graphics                             3031
3D Mark 11 Physics                              3956
Unigine Heaven 4.0                              907
Unigine Heaven 4.0 Avg FPS                      36.0 FPS
Unigine Heaven 4.0 Max FPS                      81.2 FPS
Unigine Heaven 4.0 Min FPS                      16.5 FPS
Guild Wars FPS (Eye of the North, outside)      381 FPS
COD MW2 (opening, looking towards trainees)     30.0 FPS
DMC4 DX9 Scene 1                                227 FPS
DMC4 DX9 Scene 2                                178 FPS
DMC4 DX9 Scene 3                                275 FPS
DMC4 DX9 Scene 4                                142 FPS
Crysis 2 DX11 Xtreme Times Square Adrenaline    26.0 FPS
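For context on those bandwidth figures: PCIe 1.x signals at 2.5 GT/s per lane, which after 8b/10b encoding works out to 250 MB/s of raw bandwidth per lane, per direction, before packet overhead. A quick script comparing the measured x4.1 numbers against that ceiling (the 250 MB/s figure is the spec maximum; TLP overhead and the mPCIe adapter chain account for the rest):

```python
# Compare measured x4.1 bandwidth against the PCIe 1.x theoretical ceiling.
# 2.5 GT/s per lane with 8b/10b encoding = 250 MB/s per lane, per direction.
LANE_BW_MBS = 250.0
LANES = 4

measured = {  # MB/s, from the results above
    "computer to card": 616,
    "card to computer": 810,
}

ceiling = LANE_BW_MBS * LANES
for direction, mbs in measured.items():
    print(f"{direction}: {mbs} MB/s ({mbs / ceiling:.0%} of the {ceiling:.0f} MB/s ceiling)")
```

So the card-to-computer direction is running at roughly 80% of what four PCIe 1.x lanes can theoretically carry, which is about as good as this adapter chain can be expected to do.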
15. I apologize in advance for the length of this post, but I have already invested a lot of time in getting this up and running, and I wanted to provide as much info as possible for troubleshooting.

So I have posted about my setup before. I am in the process of achieving an x4.1 link on my Dell E6500. So far, I have achieved an x2.1 link that seems stable, minus a few other issues. I have been unable to achieve an x4.1 link thus far, as I don't have a reliable way to fit the bulky mHDMI connector in the half-height mPCIe slot previously occupied by the WiFi card (located here). I will provide an update once I receive it. In the meantime, I was able to achieve an x2.1 link with two PM3Ns connected to the half-height WLAN slot and the former WWAN port, but it was not easily repeatable (loose half-height connection). In no case was I able to detect the eGPU when all four PCIe slots were occupied.

For the stable eGPU configurations at x1.1 and x2.1, I have been piecing together some performance information. The stable x2.1 connection involves one ExpressCard connector connected to root port 4 and one PM3N connected to root port 3 (where the Bluetooth card used to be).

To verify my PCI Express connections, I ran the following video memory tester, which returned no errors. In addition, I am able to complete all 3DMark testing as well. The only tests I had problems with were the DMC4 tests, which seemed to hang on my card (no BSOD, though).

Here are the eGPU video connections:
eGPU DVI 1: connected to a 23-inch ViewSonic monitor
eGPU mHDMI: connected to a 5-port HDMI switch via a 50-foot HDMI cable out of the room; the switch is connected to a Casio XJ-A245V projector. Sound output is enabled for this reason.

Depending on whether I select the PC as an input at the switch, the eGPU switches from single 1080p monitor mode to cloned 720p monitor mode.
All testing was completed in single monitor mode, so I don't think the other monitor connection is causing my problems, but I'm stating that upfront.

In addition, I am running a Corsair CX430 power supply connected to a Kill A Watt power meter. In normal, idle use, the eGPU seems to draw around 50 watts. When 3DMark 06, RE5 variable, or a similarly demanding 3D application is running, the power consumption at the outlet jumps to 110 watts. When FurMark is running, it jumps to a whopping 210 watts. My PSU is rated to handle these demands with ease, though it could conceivably be contributing; the fact that the system doesn't crash even during extended FurMark sessions tells me the PSU is not the problem here.

I am currently running the driver verifier included in Windows 7 to ferret out any driver incompatibilities. For what it's worth, COD MW2 is much more stable than Orcs Must Die 2, but I would like to play both of those games. I have also included some pics of my setup. Here is another, another.

Without further ado, here are my current test results.
DELL E6500 eGPU TESTING AT DIFFERENT PCIe LINK SPEEDS
OS: Windows 7 Professional x64
CPU: Intel X9100 (C0 stepping) @ 3.458 GHz, 1.285 V Vcore
CPU Cooling: Dell Precision M4400 copper heatpipe cooler installed in the E6500
RAM: 4.0 GB DDR2 800 MHz, DSDT override enabled
eGPU: Galaxy GTX 460 768 MB
eGPU Clocks: core 800 MHz, memory 1000 MHz

TEST TYPE                                     x1.1 PCIe Link    x2.1 PCIe Link
PCIe Speed Test Computer to Card Bandwidth    163 MB/s          327 MB/s
PCIe Speed Test Card to Computer Bandwidth    197 MB/s          291 MB/s
PCIe Speed Test Bidirectional Bandwidth       177 MB/s          308 MB/s
3D Mark 06                                    5447 3D Marks     9198 3D Marks
3D Mark 06 SM 2.0 Score                       2156              4000
3D Mark 06 SM 3.0 Score                       1992              3640
3D Mark 06 CPU Score                          3054              3042
RE5 DX9 Variable                              75.0 FPS          88.0 FPS
RE5 DX9 Fixed                                 37.7 FPS          44.5 FPS
3D Mark Vantage                               P7283             P8911
3D Mark Vantage Graphics                      7574              10155
3D Mark Vantage CPU                           6531              6516
RE5 DX10 Variable                             71.8 FPS          93.5 FPS
RE5 DX10 Fixed                                40.4 FPS          48.5 FPS
3D Mark 11                                    P2495             P2904
3D Mark 11 Graphics                           2393              2980
3D Mark 11 Physics                            2697              2695
Unigine Heaven 4.0                            390               594
Unigine Heaven 4.0 Avg FPS                    15.5 FPS          23.6 FPS
Unigine Heaven 4.0 Max FPS                    52.0 FPS          43.9 FPS
Unigine Heaven 4.0 Min FPS                    10.0 FPS          12.8 FPS
Guild Wars FPS                                87.0 FPS          131 FPS
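To make the scaling easier to see, here is a small script computing the x2.1-over-x1.1 ratios for a few representative rows from the table above. Bandwidth doubles almost exactly, while the game and synthetic tests gain less, since not every workload is bus-limited:

```python
# x2.1 / x1.1 improvement ratios, using the numbers from the table above.
results = {  # test name: (x1.1 result, x2.1 result)
    "Computer-to-card bandwidth (MB/s)": (163, 327),
    "3D Mark 06 (3D Marks)": (5447, 9198),
    "3D Mark Vantage Graphics": (7574, 10155),
    "3D Mark 11 Graphics": (2393, 2980),
    "Heaven 4.0 Avg FPS": (15.5, 23.6),
}

for test, (x1, x2) in results.items():
    print(f"{test}: {x2 / x1:.2f}x")
```

The 2.01x bandwidth ratio confirms the x2.1 link really is active; the graphics scores land between 1.25x and 1.69x, so an x4.1 link should still have meaningful headroom for this card.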