
Posts posted by gz10

  1. Just a heads-up for future reference in case anyone's doing a search on it: another source of Code 43 error is having the PCI-Express power cable not plugged into the card! Just got my new GTX 660 replacement in and fired her up and spent about 5 minutes trying to figure out why I was getting this error now when it worked with my other card. So yeah, don't be stupid like me and forget about the cable.

    - - - Updated - - -

    Just wanted to post my results with my eGPU setup and X230. Specs of my system are in my signature.

    19,496 3DMark 06 score with an external display:

    NVIDIA GeForce GTX 560 Ti video card benchmark result - Intel Core i5-3360M Processor, LENOVO 2306CTO score: 19496 3DMarks

    Now just need to get an enclosure for my system, any recommendations? I've been thinking about getting a cheap miniITX case to put it all in, to possibly boot the system by pressing the case button. Looks like the headers on the SWEX adapter are just regular old headers used to hookup PC switches. Anyone know if this is possible?

    I bought a Fractal Core 1000, but it hasn't arrived yet. My only concern was the cable length and if it will reach to where my laptop is in my setup. I believe it will with this case.

  2. Yeah looks like this is a new product: PE4H V3.2 (PCIe x16 Adapter)

    A couple of notes for why it may not change the product we'd have chosen (PE4L 2.1b):

    • You can only use video cards whose PCI-Express power connectors are on the end of the card rather than the side (the majority of high-performance cards have them on the side -- my GTX 660, for example)
    • I don't believe there's any performance difference (still limited by the ExpressCard I believe, PCI-Express x1 2.0 -- someone correct me if I'm wrong)
    • The power adapter is only 120W. I'm unsure if that would be enough for many of the higher performing cards (even though Kepler is more power efficient). From the looks of it you could also just use a normal ATX power supply and connect the 24-pin connector to it, but IMO that kind of defeats the point of a nice enclosure when you have your PSU with all its wires out in the open
    • It's more expensive. You're getting more than just the PE4L, sure, but I paid $78 for the PE4L, $17 AR for my PSU, and I'm thinking about a cheap mATX case to stick it all in (~$35). Plus this setup is more flexible

    It does look pretty nice though, and is more reasonably priced than the ViDocks. Perhaps it would be a great fit for a card like the GTX 650/650 Ti (PCI-Express power connectors on the end, and possibly within the limits of the 120W adapter). For comparison, many people have used a 203W Xbox power adapter (needs to be modded though) -- maybe hwtools will make a higher-powered adapter available in the future.
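    To put rough numbers on the 120W question, here's a back-of-the-envelope sketch. The TDP figures are NVIDIA's published reference board powers; real-world draw varies by board partner and load, so treat the headroom margin as an assumption, not a guarantee:

    ```python
    # Rough check of which cards plausibly fit a 120W power brick.
    # TDPs are NVIDIA's published reference board powers; actual draw
    # varies, and you want headroom for spikes, so this is a rough guide.
    ADAPTER_WATTS = 120

    card_tdp = {
        "GTX 650": 64,
        "GTX 650 Ti": 110,
        "GTX 660": 140,
    }

    def fits_budget(tdp, budget=ADAPTER_WATTS, headroom=0.1):
        """True if the card's TDP fits the budget with ~10% headroom."""
        return tdp * (1 + headroom) <= budget

    for card, tdp in card_tdp.items():
        status = "OK" if fits_budget(tdp) else "too tight"
        print(f"{card}: {tdp}W TDP -> {status}")
    ```

    With a 10% margin the GTX 650 fits comfortably, the 650 Ti is right on the edge, and the GTX 660 is clearly beyond the brick -- which matches the reasoning above.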

  3. Different system BIOSes handle startup differently when an eGPU is detected. My understanding is the latest X220 BIOS will just lower TOLUD and boot with the iGPU as the active primary video device. My Dell E6230 would set the eGPU as the primary video device on bootup if it was detected at power-on. To get x1.2Opt engaged, with the iGPU as the primary video device, the eGPU needed to be hot-plugged or activated using the PCI Reset Delay jumpers so it appeared on the PCIe bus *after bootup*.

    Yes, I definitely have the "select preferred graphics processor" option.

    What are the "PCI Reset Delay jumpers"? Is this the "SW1" on the PE4L? I was actually not sure what that should be set to. What exactly does this do? Add a delay to when it will register with the PCI-Express bus? I haven't had much luck with hot plugging or even switching it in during sleep, so I'll be starting from a cold boot. If I have the delay switch set to position 3 (6.9s) and turn on the eGPU first and then turn on my laptop, it should work with Optimus?

    I've noticed a slight 200-250 point difference in 3DMark 11 scores from time to time, and I'm guessing it's just times when Optimus is enabled vs. disabled. Realistically though, Optimus on/off probably doesn't make a huge difference given the x1 2.0 setup instead of 1.0. I also wonder if in CPU-bound games (this is the case for me in Planetside 2 with my GTX 660) it's disadvantageous to do the Optimus compression (this assumes the processing is being done by the CPU).
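    For reference on the x1 1.0 vs 2.0 point, the theoretical link bandwidth is a quick calculation from the PCIe spec numbers: Gen1 signals at 2.5 GT/s and Gen2 at 5.0 GT/s, and both use 8b/10b encoding, so every data byte costs 10 bits on the wire:

    ```python
    # Theoretical one-direction bandwidth of a PCIe x1 link.
    # Gen1 runs at 2.5 GT/s, Gen2 at 5.0 GT/s; both use 8b/10b encoding,
    # so each data byte costs 10 line bits.
    def pcie_x1_bandwidth_mb_s(gt_per_s):
        bits_per_s = gt_per_s * 1e9   # transfers/s == line bits/s on a x1 link
        return bits_per_s / 10 / 1e6  # 10 line bits per byte -> MB/s

    gen1 = pcie_x1_bandwidth_mb_s(2.5)  # x1 1.0
    gen2 = pcie_x1_bandwidth_mb_s(5.0)  # x1 2.0
    print(f"x1 1.0: {gen1:.0f} MB/s, x1 2.0: {gen2:.0f} MB/s")
    ```

    So x1 2.0 tops out around 500 MB/s vs 250 MB/s for x1 1.0 -- double the pipe, which is why the 2.0 link leaves less for Optimus compression to recover.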

  4. Just got my X220 eGPU setup running last night, initial results:

    Lenovo X220

    • i7-2620M
    • 8GB RAM
    • Windows 7
    • eGPU: Galaxy GTX 660
    • PE4L 2.1b (so, x1.2Opt)
    • Corsair CX430 430W PSU

    Results (@ stock):

    • 3DMark Vantage -- P16364, GPU = 19978
    • 3DMark 11 -- P5101, GPU = 5474

    Looks like it's running well.

    One question -- how can I confirm that Optimus is working? I know in the NVIDIA control panel I can see the option for the two GPUs, but Nando, in a note on his ATI/NVIDIA comparison, mentioned: "Achieved by powering the eGPU before starting the system". Does this mean that if I start my X220 from a shutdown state with the eGPU attached, it will not use Optimus? So the only way is to add the eGPU in while resuming from sleep, or by hot swapping?
