daver160

Registered User

  • Posts: 140
  • Joined
  • Last visited
  • Days Won: 1

daver160 last won the day on April 5, 2013

daver160 had the most liked content!

About daver160

  • Birthday 05/11/1986

daver160's Achievements

T|I Advanced Member (4/7)

Reputation: 49

  1. You didn't read the first page at all, did you? Go read the first page, then do a search for a thread about a successful EliteBook implementation. I seem to recall someone with a 15" model managed a very successful setup.
  2. My understanding is that if you have Intel graphics, it is the primary GPU at boot. This is basically how Optimus works. If there is an option to change the default boot GPU, it will be in your BIOS options.
  3. Yes, it ought to work. Check that the 12V rail outputs at least 8A and you should be fine. The worst that could happen, I think, is that your GPU won't power up; if that's the case, you just need a better PSU. (A quick wattage sanity check is sketched after this list.)
  4. It is pre-Optimus; I know because I have the same laptop at work. It is a Core 2 Duo model with "switchable graphics" (if you had the ATI GPU).
  5. If you want better performance, you should look into getting the PE4L v2.1. It allows a proper PCIe 2.0 (Gen2) link, which gives you noticeably more bandwidth (see the bandwidth sketch after this list). Unless I missed something and you're running a CPU older than Sandy Bridge?
  6. No, it's good that you brought up that their testing is done only on desktops. I didn't know that. It explains why their testing is so successful and ours never works!
  7. Short answer: yes. Long answer: yes, because it will give you full 1.2Opt compression, so you'll get better performance from your GPU, and it's also much cheaper than the PE4H 3.2. Also, assuming that BPlus is correct about compatibility, the PE4L v2.1 is supposed to support PCIe 3.0 (Gen3), so you could potentially use it on future laptops that support PCIe 3.0 over ExpressCard or mPCIe.
  8. Shouldn't be a problem. I have a GTX 650 Ti on 1.1Opt and I can get over 60 fps in some games at 1920x1080. For example, I get between 30 and 80 fps in Borderlands 2 with mostly high settings (30 in open areas, 70-80 when indoors), though with no AO or AA. I also get 20-60 fps in Tomb Raider with medium-to-high settings at 1920x1080 and vsync (disabling TressFX raises that to about 30-60).
  9. Well, there's definitely one benefit of the new PCIE-MM cable over the older flat one: it's flexible, so it's easier to manage. Sadly I can't find any buyers on local Craigslist who want a PE4H+PM3N. I'd love to upgrade to a PE4L, but I'm a bit loath to fund it myself. I might just bite the bullet and buy one for my own personal use and keep the PE4H at the office.
  10. Just curious, is your PE4L a v1.5? Just so we have some context, because the PE4L v2.1 doesn't have a detachable cable. If it weren't for being restricted to company funding (of which there is next to none left for this side project), I would totally order a PCIE-MM060B cable to test with my PM3N configuration. I was already able to get this eGPU to pull double duty for me (gaming during at-home leisure time, CUDA crunching during work-at-home time), but the powers that be prefer to spend funding on a guaranteed solution. If anybody can find the same cable on Amazon, however, I'd gladly pick one up to test and see if we can get Gen2 going!
  11. That's too bad that it didn't work for you. I wonder if anyone else has had similar results? I know that @jacobsson is waiting for his to arrive, so I'm interested to see if he gets similar results to yours. I wouldn't mind paying $20 for a flexible cable at least, but the shipping cost is just insane considering it's as much as the product itself. I'm also considering picking up the mPCIe extension board; maybe I can bundle them together.
  12. OK, well if you get more time, please run some more tests to see whether it's something that can be fixed with a bit of tweaking, or not fixable at all.
  13. What steps did you use to test this? Did you configure Setup 1.x or your BIOS to use Gen2? Also, did you run any benchmarks, or did you only check GPU-Z while the GPU was idle? GPUs drop the link to Gen1 at idle to save power, so an idle reading proves nothing (see the link-check sketch after this list). It's not that I don't believe you; it's that you've provided no proof whatsoever that the cable fails Gen2 compatibility.
  14. You should shrink those images a little bit, and place them in spoiler tags. Or make them attachments to your post. They're huge!
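A quick sanity check for the PSU question in #3: power is just volts times amps, so a 12V rail rated at 8A can deliver 96W. Below is a minimal Python sketch; the 75W card draw is an assumed figure for illustration, not a measurement from the thread.

    def rail_wattage(volts, amps):
        """Maximum continuous power the rail can deliver (P = V * I)."""
        return volts * amps

    gpu_draw_watts = 75.0                  # assumed: a slot-powered card's typical draw
    rail_watts = rail_wattage(12.0, 8.0)   # 12V rail rated at 8A -> 96 W

    # Leave ~20% headroom so the rail isn't running flat out
    verdict = "OK" if rail_watts >= gpu_draw_watts * 1.2 else "underpowered"
    print("rail: %.0f W, GPU: ~%.0f W -> %s" % (rail_watts, gpu_draw_watts, verdict))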
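On why Gen2 matters in #5 and #7: an ExpressCard or mPCIe eGPU link is a single lane (x1), so the negotiated PCIe generation sets your total bandwidth. This sketch computes per-lane throughput from the published spec rates (raw transfer rate times encoding efficiency):

    PCIE_GENS = {
        # generation: (raw rate in GT/s per lane, encoding efficiency)
        1: (2.5, 8.0 / 10.0),     # Gen1 uses 8b/10b encoding
        2: (5.0, 8.0 / 10.0),     # Gen2 doubles the rate, same encoding
        3: (8.0, 128.0 / 130.0),  # Gen3 switches to 128b/130b encoding
    }

    for gen, (gts, eff) in sorted(PCIE_GENS.items()):
        mbytes_per_s = gts * 1e9 * eff / 8 / 1e6  # bits/s -> MB/s
        print("PCIe %d.0 x1: ~%.0f MB/s" % (gen, mbytes_per_s))
    # ~250 MB/s for Gen1, ~500 MB/s for Gen2, ~985 MB/s for Gen3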
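And on verifying the link for #13: the negotiated generation has to be read while the GPU is under load, since it falls back to Gen1 at idle. GPU-Z's render test works for this; on an NVIDIA card you can also ask the driver directly, as in the sketch below (it assumes a driver recent enough that nvidia-smi supports these --query-gpu fields):

    import subprocess

    # Query the current (negotiated) PCIe link generation and width.
    # Run this while a 3D load is active, or power saving will report Gen1.
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=pcie.link.gen.current,pcie.link.width.current",
         "--format=csv"],
        capture_output=True, text=True, check=True)
    print(result.stdout)  # a Gen2 x1 link should show "2, 1"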