Posts posted by daver160

  1. OK, that's the response I wanted.

    One more question: do both of them come with all the cabling I need? Connecting via ExpressCard I'm sure about (everything hinges on that), but what about connecting to the PSU and SWEX, if needed?

    Yes, buying an eGPU kit will come with the following components:

    *"PCI board" - the board that the PCI card will plug into

    *SWEX switch - a separate board that accepts the PSU's 24-pin plug, and a physical switch to turn the eGPU kit on/off

    *ExpressCard adapter - this plugs into your Lenovo

    *data cable - connects the "PCI board" and the ExpressCard adapter

    Now, the thing to note is that the PCI board, data cable, and ExpressCard adapter are all soldered together into a single piece. The only piece that is physically separate is the SWEX board.

    Very true, and I thank you for that. My objection was to the use of x16 as I firmly believe it shouldn't be used when describing _any_ PE4L or PE4H setup (even BPlus are guilty of this). The PE4L has an open ended x1 slot; the PE4H 2.4 has a x16 slot but can be connected at most as an x4 (and beyond x2 is hard); the PE4H 3.2 has a x16 slot but is hard-wired as x1.

    Any mention of x16 suggests that you have something like the connector inside a regular PC and it's at best only physically the same connector.

    No, you are correct. However, monitoring software such as GPU-Z report that it's x16. I only used the full spec as reported by GPU-Z, because almost all software likely reports the same thing.

    Especially as davide445 has already stated that his eGPU setup is not for gaming (I'm not sure whether there's anything to be gained from an eGPU for davide445's usage, but that's not my place to answer - I simply have no knowledge of the use case to make card suggestions).

    Whether or not I find your nVidia/AMD pairing equivalent is beside the point; the choice should be nVidia-only if using the GPU to drive your laptop's internal display. It's possible to use AMD, but there's more software setup to do.

    I'd also suggest that any PSU from 250W upwards will work; I just won't recommend anything low-end :)

    Again, you're right, but I was making assumptions that he might be using the eGPU for video converting or CUDA crunching. Like you said, there's absolutely no need for an eGPU kit if you're not going to play games or run GPU-based data crunching.

    I only added AMD GPUs to the list as they are cheaper than their Nvidia counterparts; as well, AMD GPUs are much more abundant and you can get something like a 6450 for $25 here in Canada.

  2. Costs and case aside, my final concern is whether a PE4L or a PE4H makes more sense.

    I've read endless debate but still don't understand whether it makes any difference for me.

    If you ask me, it comes down to 2 factors:

    1. Do you want to build a custom case later on?

    2. Cost

    Performance-wise, they should be identical. The only difference is whether or not you want to spend time, effort, and money on building an enclosure for it. The PE4H takes care of that for you (if you buy the optional aluminum case with it).

  3. First up, apologies as I'm not familiar with how VAT and duties are added to the total (I know how VAT works, but beyond that I'm clueless). Is there a specific website you will be ordering the PSU/case from? Just wondering, so that I could peruse the site and see if I could come up with some alternative choices (help save a few bucks if possible :) )

    Now, I personally think that the PE4L is the better package only in the sense it's cheaper (more bang for your buck) and that you have the flexibility of a custom case to fit your GPU and PSU. While the PE4H 3.2 is great in that it comes with a case already, it's entirely possible that the GTX 660 that you intend to purchase is too thick (thanks to the cooling fins and fans) for the PE4H case. I'm just worried about extraneous factors which you cannot control, and won't know about until it's too late!

    I'm not embarrassed to say it, but my first eGPU enclosure was a shoebox and a bunch of popsicle sticks propping things upright. It worked marvelously until I got my hands on my current Shuttle enclosure. The only downside of a shoebox is that you cannot reliably stack heavy objects on top of it.

  4. Not quite. Firstly I'll explain what the non-Opt means:

    1.1 Means it's the same bandwidth as 1st generation PCIe x1 (PCIe 1.0a x1 or the equivalent PCIe 1.1 x1).

    1.2 Means it's the same bandwidth as 2nd generation PCIe x1 (PCIe 2.0 x1). This is twice the bandwidth of PCIe 1.1 x1.

    The Opt means nVidia's Optimus compression. This is detailed in the first post of Nando's DIY eGPU Experiences. Basically, you get data compression between the CPU and GPU, so you get more performance out of the available bandwidth.
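    To put numbers on those labels, here is a per-lane bandwidth calculation (a rough sketch; the 2.5/5.0 GT/s rates and 8b/10b encoding overhead are the standard Gen1/Gen2 figures, and the function name is my own):

```python
# Rough usable per-lane PCIe bandwidth for Gen1 ("1.1") and Gen2 ("1.2") links.
# 8b/10b encoding means 10 bits on the wire carry 8 data bits.
GT_PER_S = {"1.1": 2.5, "2.0": 5.0}  # gigatransfers/second per lane

def lane_bandwidth_mb(gen: str, lanes: int = 1) -> float:
    """Usable bandwidth in MB/s for a given PCIe generation and lane count."""
    bits_per_s = GT_PER_S[gen] * 1e9 * (8 / 10)  # strip encoding overhead
    return bits_per_s / 8 / 1e6 * lanes          # bits -> bytes -> MB

print(lane_bandwidth_mb("1.1"))  # 250.0 -- the "1.1" eGPU link
print(lane_bandwidth_mb("2.0"))  # 500.0 -- the "1.2" (Gen2) eGPU link
```

    So a "1.2" link simply doubles the x1 pipe, and Optimus compression then stretches whichever pipe you have.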

    I was trying to keep it real simple. If one doesn't understand how PCI Express works, fewer words and numbers are usually easier to understand :)

    @davide445, don't think about the cost as being that high.

    My eGPU setup, altogether, cost me about $300 CAD after taxes and shipping. You just have to find a good deal on the individual pieces. I got both my GPU and PSU on sale.

    GPU: < $150

    PSU: < $50

    PE4H: $100 ($91 + shipping)

    Shuttle XPC case: Free, got it off local Craigslist

    You don't need an all-powerful GPU, unless you demand the most in your games, and you don't need a super fancy PSU with Bronze/Silver certifications and guaranteed 12V1/12V2 rail outputs. Any standard 400W PSU will suffice (so long as you double check that the 12V rail outputs at least 12 amps), and any GPU will be just fine. Just find a GPU that meets your performance expectations (e.g. if you want great performance, look at a GTX 670 or 7870 and higher; if you just want an improvement over built-in GPU performance, look at a GTX 660 or 7770 and higher).
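    That "12 amps on the 12V rail" rule of thumb is just power-law arithmetic (a trivial sketch; the helper name is mine, and the amp figures are the ones from this post, not official specs):

```python
def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Power a PSU rail can deliver: P = V * I."""
    return volts * amps

print(rail_watts(12.0))  # 144.0 W -- the suggested minimum for a mid-range GPU
print(rail_watts(20.0))  # 240.0 W -- what a power-hungry GTX x80/x90 might want
```

    Compare the result against your card's rated board power to see whether the rail has headroom.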

  5. Thanks. I was searching about Optimus but can't find any reference to 1.1 or 1.2, what does it mean? A spec of the architecture, a specific sw conf? Can you give me some hint?

    Think of it kind of like "how much PCI compression you get" out of Optimus.

    So 1.1Opt is basically "PCI-E 1.1x16 @ 1 1.1", meaning that you are getting PCI-E 1.1 support. You'll still get very good performance from the GPU, but you're not getting everything out of your GPU.

    1.2Opt is something like "PCI-E 2.0x16 @ 1 2.0" (I can't remember the exact numbers), which means it's using more compression through Optimus, thus providing more performance.

    It sounds like you're still a little bit confused about how the eGPU setup works. Basically, if you are considering doing an eGPU setup, you will need at least 3 components:

    *Power supply (PSU)

    *graphics card (GPU)

    *PE4L and one of the below--

    **EC___a adapter (if you have an ExpressCard slot on your laptop, then get this)

    **PM___a adapter (if you have only a mini PCI-E slot in the underside of your laptop, then get this)

    The way the eGPU is physically set up is like below:

    1. Plug the GPU into the PE4L board

    2. Plug the power supply into the GPU (if necessary, this depends on the GPU itself)

    3. Plug the power supply into the PE4L board (required)

    4. Plug the power supply into the PE4L SWEX power switch board (required, SWEX board comes with PE4L package)

    5. Plug the PM___a/EC___a adapter into the laptop

    The rest of the setup is software. You might need to tinker a bit to get everything to work just right, but essentially this is all that is necessary. This method is basically the most user-friendly. Everything fits together like LEGO, so it's hard to go wrong. As well, although this seems like a relatively daunting DIY project, it is really quite safe. You cannot do any major physical damage to your laptop with the eGPU kit (the worst you can do is potentially damage the mini PCI-E port in your laptop), as really it's all mostly a software thing. You may have seen the name "Setup 1.x" throughout this eGPU sub-forum; this is software used to disable some built-in GPUs in laptops, as well as do some special PCI compaction in order to make room for an eGPU. Since you have an X220, you definitely don't need Setup 1.x, in case you were wondering.

  6. About this 1.1Opt may I ask you what exactly is? A sw version, a benchmark?

    I can't find a single Shuttle XPC case, what model are you using?

    "1.1Opt" or "1.2Opt" refers to PCI compression through the Nvidia Optimus.

    It's not really "software" per se, but rather the method that the Nvidia GPU sends video data to the laptop. 1.1Opt provides less performance than 1.2Opt.

    The Shuttle XPC case I'm using is a very old one. The precise model number is the Shuttle XPC SN95G5. I just used the case, as you can see in the photos I uploaded. It's larger than the mini ITX style case, but due to its larger dimensions I'm able to stack stuff on top of it easily and safely.

    Before I started using the Shuttle case, I was literally using a (New Balance) shoebox.

  7. From reports in the main eGPU thread, it seems that a 650 isn't bandwidth limited across even a PCI express 1.0 x1 connection (!), but cards above that certainly are. That does not mean that a 680 performs the same as a 650.

    I can confirm this, though it's not totally accurate: there is a little bit of bandwidth limitation.

    I'm running a GTX 650 Ti in my eGPU at 1.1Opt, and I'm getting about 90% of the GPU's full power. However, when I pull it out and drop it into my desktop, I'm seeing a definite framerate increase in the same games (e.g. Metro 2033, L4D2), despite my desktop only being C2D E8400 and my laptop being i7 2760QM. As well, my GPU monitors (MSI Afterburner) show that in my eGPU my 650 Ti never goes above 80-90% load, whereas on my desktop I've seen it spike to 100%.

    So while my desktop gets better performance out of the GPU, we're really only talking about something like 5-10 frames more than my laptop, and when you're already running a game at 60-80fps, that's not really that much in the end. I just wanted to toss that out there.

    OK, that means I need to look at cases with two PCI/expansion slots (right?)

    Will the video card be powered from the case's PSU if I put it in the case's PCI slot? I assume the PCIe power connector is integrated into the case expansion slot where I insert the card (the one on the right of the photo), or am I totally wrong?

    So I also need to connect the PSU to the PE4L, but does an integrated PSU normally have a spare power connector to do so?

    Or am I totally wrong, and the case is just a box that merely contains everything, with none of its "services" (i.e. a powered PCIe slot) used by an eGPU setup, apart from the PSU itself?

    If so, I really don't see the point of using one of these cases, apart from the finish.

    FWIW, I have my eGPU housed in a Shuttle XPC case. It's not as small and elegant as a mini-ITX, but I found that it's more convenient due to the size: I can stack my secondary monitor (19") monitor on top of it, and it's now top-aligned with my primary monitor.

    In my thread here, I have some pictures of my enclosure. As you can see, it's larger than a mini ITX case, but not by too much. And because of the larger case, I have lots of empty room inside that allows me to stow away the cables (PSU power, DVI), making it really easy to just pack up and go.

  8. But do you think that 500watts will be enough for a 560ti?

    Thanks!

    It should be, but you can just check the output on the 12V rail. It should be at least 12 amps. I would imagine that only the most power-hungry GTX x80/x90 GPUs would require something like 20 amps on the 12V rail.

  9. @daver160 - Is there a PE4H 2.4 with a hardwired cable? Any board with a detachable cable isn't going to work for for PCI-2.0. If you don't like the format of the PE4H 3.2 then I would have thought that the PE4L 2.1b would be ideal.

    Not that I know of; all the known PE4H 2.4 kits have detachable cables.

    It's not that I dislike the format of the PE4H v3.2, it's just that HIT doesn't carry it right now. So given my situation, I have 2 choices: wait for the PE4H PCI-2.0 compliant kit to arrive at HIT, or just go with the PE4L which we all know is PCI-2.0 compliant.

    The more I think about it, the more I'll probably just go with the PE4L. I'll see no major performance gain by waiting for a newer PE4H kit, and frankly I don't think my 650 Ti will be able to max out any new PCI 3.0 that may come out soon.

  10. Anybody heard from HIT or other vendors whether or not the elusive PE4H PCI-2.0 compliant kit is on the way, or still in the discovery process? It sounds like the PE4H v3.2 kit is probably "it", though nobody has really confirmed definitively that this is the case.

    I'm just wondering if I should keep holding out for the upcoming PE4H PCI 2.0 compliant kit, or if I should just go for the PE4L v2.1. The guys at HIT have already been so accommodating to allow me to get a full refund of my PE4H kit well outside of the standard 30-day period. I just don't want to be a butt head about this and do a refund 6 months outside of their normal 30-day refund/exchange period (it just doesn't feel right to me).

  11. Congratulations! I'm sure that though you are not getting full performance from the egpu, it is still much better than the gt420 that you have.

    You'll have to let us know how well it's working for you, such as benchmarks, if you have the time to do so. Also, it would be fantastic if you could create a thread, similar to this one and the others, describing how you got your eGPU to work. That way anyone else with an L501x can follow your steps to get it to work!

  12. Hello, I am totally new to this eGPU concept and may be also lack the skills to do all of the work but I still want to have a go with it.

    Dell XPS L501x

    i5 M460 2.5GHz

    8G RAM

    GeForce GT 420M

    My eGPU is a GTX 550 ti 1GB.

    Just going to throw out there that though yours is the L501x, I don't think there would be much of a difference in getting it to work.

    I didn't follow your exact instructions but used a simpler procedure without Setup 1.x. Everything works fine; I got the laptop to recognize the eGPU and installed the driver directly from Device Manager. Device Manager now shows that the GTX 550 Ti is working properly without any error.

    What specifically did you do to get this to work? What were your steps? Did you just assemble the eGPU, plug it in to your machine and power it on?

    What eGPU kit do you have? PE4H or PE4L?

    The eGPU will always be detected by Windows; whether or not you get Errors depends on certain steps for different computer models. In this case, please share what you did to NOT get an Error 12 upon simply assembling and plugging in your eGPU kit - this would be extremely interesting to hear.

    When I connect it to an external monitor and reboot, it doesn't show on the external monitor. I tried to change the main display; it shows that there is an available display output on the GTX 550 Ti but no display detected. I figured that it could be because I didn't turn off the dGPU, as the GT 420M still appeared in Device Manager, so I installed Setup 1.x and gave it a try. I came across the same issue you had when trying to turn off the dGPU, and tried the troubleshooting with startup.bat, but nothing works.

    From reading this thread, I edited the startup.bat just like your fix, rebooted, then ran the startup.bat script, and it just gets stuck there. Did I mess up the whole thing by installing the eGPU driver without using Setup 1.x in the first place? It seemed right to me since the eGPU shows without any errors in Device Manager.

    So this confirms my suspicion that while your eGPU was detected by Windows, it wasn't really working. Because we have iGPU + dGPU, we must disable the dGPU in order for the eGPU to take its place and take over the primary display.

    As for the startup.bat problem, this might be due to the hardware ID not matching. Because your dGPU is not exactly the same as mine, your hardware ID might be different. So for the line in startup.bat that says

    call vidwait 60 10de:11c6
    call vidinit -d 10de:11c6

    Make sure you change the "10de:11c6" to whatever your eGPU hardware ID is reported by Setup 1.x (it shows up in the top right box when it first loads). You didn't mention if you've done this, so I'm assuming that you have not, and also that your hardware ID is different (it could be the same, but I'm assuming otherwise just in case).
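    If you're not sure what your card's hardware ID is from inside Windows, Device Manager reports PCI devices with a PNP ID of the form PCI\VEN_xxxx&DEV_xxxx&...; here is a small sketch (the helper function is my own, not part of Setup 1.x) that converts that string into the vendor:device form startup.bat expects:

```python
import re

def pci_hw_id(pnp_device_id: str) -> str:
    """Turn a Windows PNPDeviceID (e.g. from Device Manager's Details tab)
    into the lowercase vendor:device form used in startup.bat."""
    m = re.search(r"VEN_([0-9A-Fa-f]{4})&DEV_([0-9A-Fa-f]{4})", pnp_device_id)
    if not m:
        raise ValueError("not a PCI PNP device ID")
    return f"{m.group(1).lower()}:{m.group(2).lower()}"

print(pci_hw_id(r"PCI\VEN_10DE&DEV_11C6&SUBSYS_00000000&REV_A1"))  # 10de:11c6
```

    Double-check the result against what Setup 1.x itself reports on its screen; that on-screen value is the authoritative one.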

    OK, so I tried uninstalling the GT 420M driver completely from Device Manager and trying Setup 1.x again, and the problem is not there anymore. But in Device Manager there now appear to be 3 display adapters: one is Intel HD Graphics and the other two are Standard VGA Graphics Adapter?! Btw, I tried to set pci_alloc_valid to yes, but when I chainload it keeps saying that I haven't done so.

    If Setup 1.x worked properly, you should have only 2 display adapters (Intel HD iGPU and GTX 550 Ti eGPU). In this case, it looks like the dGPU was not properly disabled by Setup 1.x. As for the "Standard VGA Adapter" names, that's just because you currently do not have any Nvidia drivers installed.

    Could you please download and install the driver for GTX 550 Ti, and then try again? Please try to do this while your eGPU is connected, but without running Setup 1.x (this driver will detect both Nvidia cards).

    As for Chainloading, it will work if Setup 1.x successfully disables dGPU and re-allocates PCI space for the eGPU. If any of those steps failed, then Windows will boot up and detect the Intel HD and both dGPU and eGPU like you see now.

  13. I think I did. I cannot see it in the Device Manager menu; is there something else I need to do? And Windows cannot see the eGPU at all.

    OK, well to me step 1 should be making sure that the eGPU itself is working. The yellow LED on the PE4L probably means something is not working correctly.

    Even without disabling the dGPU, Windows should still detect the eGPU by displaying your graphics card in Windows Device Manager (e.g. "NVIDIA GTX 560 Ti") but show a yellow exclamation mark indicator for the Error 12. Since you are not seeing this in Windows, it means that your eGPU itself is not properly configured.

    How have you connected your eGPU kit?

    The PSU should be connected to the PE4L board (4-pin floppy), the SWEX board (24-pin ATX), and the GPU (6-pin PCI; you should need 2 cables for your GTX 560). Then the PE4L should be plugged into the mPCI-e card on the underside of your laptop. With this, powering on the eGPU kit should display all green LEDs. Yellow means something is probably not plugged in properly; Windows not detecting your eGPU at all (nothing in Device Manager) confirms that.

  14. Hi, I'm having a problem. My L502X can not detect my egpu(gtx 560ti).

    What I did was uninstall my GT 525M and install my eGPU. I think I installed it the right way; both the eGPU and PSU fans are working, and a green LED and a yellow LED appear on my PE4L.

    Then, after I restart my laptop, it can only detect the GT 525M and installs its driver automatically.

    Please help.

    Did you disable the GT525M? You need to disable it for Windows to properly detect and use the eGPU. Also, does Windows see the eGPU at all?

  15. I have a 650Ti and the latest PE4L on my X220T and I am very happy with my performance. I will be making a thread like yours when I finish it all.

    However, my 3DMark score is lower than yours. I'm guessing this is due to the slower processor; however, I can run the Heaven benchmark later at the same settings as you if you'd like a comparison.

    Indeed, I'd expect that you'd likely have better scores in true GPU-only benchmarks. My CPU definitely gives me an advantage anywhere the CPU is used, but the 1.2Opt you're running on probably reduces any major advantages my CPU provides. In day-to-day gaming though, I would imagine that you're getting better performance than I do!

    I think it would be really great if you created a similar thread as myself and others have, so that future eGPU DIY-ers with your same laptop model can follow along with what you've done to get up and running.

    While we're not pioneers, we should definitely be town criers!

  16. Hi all,

    Quick show of hands: who thinks I should forego waiting for the PCI 2.0 compliant PE4H from HIT and just get the current PE4L instead?

    I'm not looking for top of the line performance, but I would like the 1.2Opt that comes with Gen2 compatibility. I'm not in any rush, and HIT has been great in still honouring the discussed plan of refunding the PE4H and getting the PE4L. I just don't want to have to wait a whole year.

    So does anybody think it's still worth waiting for the new PE4H kit? I'd love to test it out and let people know how it is (and of course benchmark the digital snot out of it), but I just don't know WHEN that will be. It could be next year for all I know!

  17. I'm at work right now, so I need to be brief, but I'll address these questions as best as I can for now. I'll go into further detail when I get home and can remember what I did.

    2. Installing and setting up the eGPU components basically worked too, but if I boot up my notebook with the eGPU powered up and plugged in, it doesn't start. I just get a black screen. It only works if I plug in the eGPU in sleep mode.

    I've actually not been hotplugging it while Windows is in sleep mode. My method is to have the eGPU off, boot up the machine, boot into Setup 1.x, then power on the eGPU.

    I, too, have the weird issue where I get a black screen with the eGPU powered on and plugged in (if I don't boot and chainload from Setup 1.x). I found that turning on and plugging in the eGPU after booting fixes that issue.

    3. Running Setup 1.x didn't work at all. I had the same problem as daver160: it just freezes if I try to disable the dGPU. So I tried to switch out the startup.bat, and then I did not know how to go on.

    Have you tested to make sure that Windows detects your eGPU?

    Power on your laptop, then power on and plug in your eGPU, and boot into Windows normally. Device Manager should detect your eGPU so that you have 3 display adapters. Your eGPU GTX 660 should be recognised but show an Error 12, which is a good sign.

    Now, as for the startup.bat, not sure if it was just a typo, but make sure each call is on its own line, like so:

    setpci -s 1:0.0 COMMAND=0:7 10.l=0,0,0,0,0,0 -s 0:1.0 b0.w=10:10 19.b=0,0 3E.w=0:8 COMMAND=0:7 20.l=0,0
    call iportbus force
    call iport g2 1
    call vidwait 60 10de:11c0
    call vidinit -d 10de:11c0
    call pci
    call chainload mbr

    Don't forget to make sure that your "10de:11c0" is in fact the one listed in the Setup 1.x screen. For example, when you boot up into Setup 1.x, it will detect your iGPU, dGPU, and eGPU. Make sure that you are using the address ID that Setup 1.x sees for your eGPU; otherwise it'll be looking for something that's just not there!

  18. It might work if you can get Win7 to allocate the eGPU and you hotplug in to overcome mPCIe whitelisting issues. However, the gt540M would be assigned the Optimus features, so the gtx660 would run in x1-only mode. There you'd miss out on Optimus internal LCD screen mode AND pci-e x1 compression which *greatly* accelerates DX9 and somewhat DX10. To get the greatly desired performance boost requires the gt540m to be disabled.

    From my PM discussion with daver160, we found that neither the stock nor the modified BIOS could disable the dGPU. The modified one does give a PEG option, which didn't do anything. So daver160 resorted to using Setup 1.x to successfully disable the dGPU.

    Setup 1.x, when automated, presents as a Win7 boot-menu item, so when you want the eGPU you just hit that item; it does its thing and chainloads back to the Win7 boot menu, where you select Win7. It adds about 1.5s to the whole bootup time.

    As Nando already stated, our BIOS, even the modified ones by capitankasar (from NBR) cannot actually disable the dGPU. The GT540M is basically on "all the time", in the BIOS we can only specify whether we want to boot Windows with the iGPU or the dGPU.

    There is a setting for changing the TOLUD value in the modified BIOS, but Windows 7 (haven't tried Win 8) does not respect this value. For example, by default my TOLUD value was set to 3GB in the BIOS, and Windows 7 recognises this (as evidenced in the Device Manager). However, changing the TOLUD in the BIOS to values like 2.5GB and 3.5 GB did not show in Windows 7; the device manager in Windows 7 always reported a value of 3GB.

    Without any tinkering, I don't think anyone with our laptop models (L501/L502) would be able to get the eGPU working without first disabling the dGPU. At least for me, I absolutely have to use Setup 1.x to get my eGPU going.

  19. Thanks for the update Hunter20 ! I was about to do the same thing (email BPlus about the KZ-B22) but luckily I checked the forum here first. I guess those of us who want pcie 2.0 support have no choice but to use the eGPU kit 'as-is'. Bummer...

    Not saying that this is a great idea, but nobody said you couldn't extend the cable that comes with the eGPU kits. They are basically mHDMI cables that have their ends soldered to the two boards. All you need to do is buy an extra mHDMI cable, cut off its connectors, cut the current soldered mHDMI cable, and splice the new cable into the old one. I don't think that cable splicing should have any negative effect on the Gen2 support, because let's face it, copper is copper. All cable splicing would do is extend the total distance the signal has to travel. It shouldn't interfere with the actual compatibility itself...

  20. Thanks ! I guess if they sell the PE4L with a 200cm cable there should be no difference in terms of signal... My concern was also related to the fact the KZ-B22 ribbon slides-in, it's not soldered to the end plugs... not sure whether that has any impact on the signal.

    Soldered vs. not soldered should not make any difference in the actual signal strength (unless a non-soldered cable comes loose). However, I don't know how this would affect the whole PCI Gen1 vs Gen2 compatibility. I don't think it would negatively affect Gen2, but somebody else will have to confirm that. It's being said over and over again right now that non-soldered cables do not allow Gen2 compatibility, but that is directly related to the PE4x boards, not the actual mPCI-e port in the laptop. My understanding of these mPCI-e extenders is that they simply act as extension cables.

    I couldn't find any other extender (why are these things so damn hard to find?) except for the one listed on this site (scroll down to PE-MINI-FLEX): adexelec

    Since my knowledge in electronics is rather limited (to say the least) I don't quite get that theory with the power jumpers removed & all... So I'm not even sure it's the right thing. It looks like it also uses a "slide-in" ribbon.

    From the description, I believe the power jumper acts as a kind of toggle between using an external power source and getting power from the source port itself. For example, if the power jumper is IN, then you'll get power from the mPCI-e port itself; if the power jumper is OUT, then you'll need to power it from an external source. This kind of ribbon connection is extremely common in laptops - most trackpad and keyboard connectors use this ribbon connector (a loose ribbon that you clamp down into place).

  21. Hi all !

    I am about to order my eGPU setup. Since I will plug/unplug the m-pcie adapter quite often, I am worried about excessive wear of my laptop m-pcie slot. So, before ordering, I'd like to know if it's possible/recommended to use an extension cable/ribbon, something like this one (on HWTools site):

    KZ-B22

    So, is KZ-B22 the only m-pcie extender (I couldn't find anything else with google) or does someone know of another, better product ?

    Also, is it OK to use an extension like that or will it affect the GPU signal etc ?

    Thanks in advance !

    Great find, and great idea bringing this up. I was thinking of getting something similar, as the bulkiness of the mHDMI cable is currently applying a bit of flex stress to the PM3N board.

    I can't confirm with certainty that it won't add any delay to the signal, but such a short cable should not have any major effects.

    If you could get a PE4L-PM060 or a PM200a (same card, only difference is length of cable, 60cm vs 200cm) and not notice any degradation of signal to and from the GPU, then it should stand that extending the mPCIe card by something like 10cm should not negatively affect the GPU performance.
