angerthosenear

Moderator
  • Posts: 551
  • Joined
  • Last visited
  • Days Won: 10

Posts posted by angerthosenear

  1. tintin's testing at http://forum.techinferno.com/diy-e-gpu-projects/3094-egpu-desktop-htpc.html#post44463 confirms BPlus' advice that the PE4L 2.1b is pci-e 3.0 capable.

    Only problem is there are no pci-e 3.0 mPCIe or ExpressCard slots. That is where the manufacturers can come to the party: by attaching a pci-e 3.0 Intel Series-7+ northbridge (IVB/Haswell) to one of those ports. Chances of that happening are low because Intel's guidelines have x1 ports attached to the pci-e 2.0 southbridge instead. A manufacturer doing this risks getting blacklisted by Intel, as they would be providing functionality outside of the intended/designed scope.

    Where is the TB port attached? That is an x2 port, so could it be attached to the northbridge? Is BPlus working on getting licensing/permission to manufacture a TB adapter?
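    For context on why a pci-e 3.0-capable adapter matters, here's a quick sketch of the theoretical per-lane bandwidth by PCIe generation (standard encoding overheads only; real-world throughput is lower, and the gain only materializes if the host port itself is Gen3):

```python
# Theoretical usable bandwidth of a single PCIe lane per generation.
# Gen1/Gen2 use 8b/10b encoding; Gen3 uses 128b/130b.

def lane_mb_per_s(gen):
    specs = {
        1: (2.5, 8 / 10),     # 2.5 GT/s, 8b/10b
        2: (5.0, 8 / 10),     # 5.0 GT/s, 8b/10b
        3: (8.0, 128 / 130),  # 8.0 GT/s, 128b/130b
    }
    raw_gtps, efficiency = specs[gen]
    # GT/s * encoding efficiency = usable Gbit/s; * 1000 / 8 = MB/s
    return raw_gtps * efficiency * 1000 / 8

for gen in (1, 2, 3):
    print(f"PCIe {gen}.0 x1: ~{lane_mb_per_s(gen):.0f} MB/s")
```

    So a Gen3 x1 link would nearly double the ~500 MB/s a Gen2 x1 slot gives today, which is why the host-side slot is the bottleneck, not the PE4L.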

  2. Thank you :)

    I have a huge problem now though...

    I cut the cable of the PE4L and soldered it to HDMI connectors, you know, for mobility, but now it doesn't work anymore.

    --

    Is this because of crosstalk? The HDMI wire is 0.75 m long; with the shortened PE4L cable I am at approx. 90 cm total length.

    I tested all connections with a multimeter: no bridges, every lane arrives correctly, and resistance is 0.8 Ω on every lane.

    Please help :(

    OH!

    That doesn't work. HDMI cabling is too slow for PCIe x1 2.0. That is why the PE4L v2.1 has soldered-on connectors.

    Try using a DisplayPort connector/cable. I've been talking to @Tech Inferno Fan about this exact thing. Neither of us has tried the DP connection with the PE4L v2.1, but the speed it offers might be enough to support it.

    Give it a whirl; we would both be interested in an interchangeable DP cable.

  3. Hmm, if you can wait, I would. Seeing as you are stuck with what you have graphics-wise, you might as well wait for something better. I'unno. The 30% increase might not be worth it since it is 6+ months away.

    Be aware of the available space depending on the version you have. The OS + other stuff uses over 30GB of space, so it might be worth getting the 128GB version. And replacing the SSD... is really, really hard.

    Microsoft Surface Pro Teardown - iFixit

    Tons of glue and over 90 screws... no thanks.

  4. 2hrs

    .... More like 2 days....

    I still cannot get iGPU + eGPU compaction working without the dGPU. Even when using dGPU ignore, compaction, then dGPU off, Setup 1.2 just breaks. No idea. I tried an nvidia driver update (the 320.xxx beta). DON'T DO THAT. Black screen on boot; with the dGPU off, video playback becomes a game of watching squares. Had the fun of re-installing drivers.

    I ran the RE5 Benchmark though. 130 fps average if I recall correctly. TBH I kinda forgot.

    Any other thoughts on why I cannot compact with dGPU [ignore] or [off]? Sometimes with [ignore] I cannot boot at all and get a black screen / have to kill the system. Funny how iGPU + dGPU + eGPU is easier for me lol.

  5. Well, the new Clevo P370SM3 120Hz/3D seems tempting. As I mostly work on my systems, the integrated Thunderbolt port comes in handy to capture multiple 3D streams losslessly, etc.

    I would combine it with 2x K6000M and a nice XM CPU, and tweak the BIOS and cooling myself.

    But, as Khenglish said, I really enjoy maxing out my P170HM3 and hacking the sh.. out of it. I think next year will bring us the real performance boost for CPU and GPU.

    To sum it up, I feel content and have no urge to get a new system this year.

    How about for the less BIOS-modding inclined (read: me)? How good is the stock BIOS?

    Thunderbolt would be awesome to have, especially if I could get my hands on a TH05 adapter (Thunderbolt to PCIe adapter for eGPU). The physical side of modding is appealing.

  6. I had one apart the other day. You can fit a micro ITX mobo in there. And with something like this:

    picoPSU-160-XT, 160watt (200watt peak) , 12V input DC-DC ATX Power Supply

    OpenUPS

    You can make it the desktop it always wanted to be. Just hook up some LiPo batteries (or use the giant one it already has).

    tbh, I'm actually trying to figure out how to build something with both a desktop CPU and a desktop GPU while still fitting the laptop category (not the Dell version of the laptop category). The only thing I cannot figure out is how to adequately power a desktop GPU off batteries. A good GPU, I should say (laptop with a Titan, let's go!).

  7. I've been downloading (and still am) the RE5 benchmark since I replied to your post. I forgot to note this in my scores/writeup: I performed all this testing with my full hookup of monitors, internal + external connected to the laptop and 4 monitors connected to the eGPU. I'm sure this hurt the score somewhat.

    I will post results from some more testing in probably the next 2hrs.

  8. Thank you for posting these. In the 14052 3dmark06 result with the dGPU off, the CPU score is way lower than the others: 2580, whereas the prior tests have it > 3200. That means the pci-e compression isn't working at full performance.

    Try redoing the 3dmark06 test with the dGPU off again, and if necessary set the power profile of your system to High Performance. With your CPU running at > 3200, I'd expect to see a 3dmark06 score of ~18k for your i7-2620M 2.7 + GTX 660 Ti.

    Can you also do a RE5-var-dx9-1280x800 test with and without the dGPU on? That would give a clear indicator of the real-world DX9 gaming performance difference with and without the pci-e compression (x1.2Opt) engaged.

    I seem to have an issue when running 36-bit compaction after setting the dGPU off. I ran the test without explicitly running compaction in Setup 1.2x; I will run one of the 32-bit compactions and try again. My computer is always set to High Performance. Yeah, I thought that result was a bit odd.

    I'll give it a whirl. Is there a DX11 / PhysX benchmark that is gaming-focused you would like me to test as well?

  9. Oh boy, that was a fun slew of sitting watching benchmarks run.

    For some reason I cannot get Vantage to work worth squat.

    I ran each test with stock clocks (eGPU clocks).

    I ran 3DMark06, 3DMark11, and the new 3DMark. For each, I tested with dGPU dedicated to PhysX, CPU for PhysX (with dGPU on), and dGPU off (CPU for PhysX).

    This is certainly all over the place. I'll just post the results and guess at what they mean...

    (All results below: NVIDIA GeForce GTX 660 Ti, Intel Core i7-2620M, FUJITSU FJNB231.)

    3DMark06

    dGPU PhysX: 10476 3DMarks (SM2.0 4597, HDR/SM3.0 4247, CPU 3232)

    CPU PhysX: 10399 3DMarks (SM2.0 4536, HDR/SM3.0 4244, CPU 3206)

    dGPU off: 14052 3DMarks (SM2.0 6048, HDR/SM3.0 8146, CPU 2580)

    3DMark11

    dGPU PhysX: P5503 (Graphics 6908, Physics 3376, Combined 3486)

    CPU PhysX: P5490 (Graphics 6899, Physics 3372, Combined 3455)

    dGPU off: P5131 (Graphics 6944, Physics 2835, Combined 2945)

    3DMark

    dGPU PhysX: Ice Storm 73773, Cloud Gate 8913, Fire Strike 3714

    CPU PhysX: Ice Storm 73937, Cloud Gate 8859, Fire Strike 3725

    dGPU off: Ice Storm 75153, Cloud Gate 9020, Fire Strike 3701

    -----------------

    I can only come to the conclusion that it really matters what you will be running. Since I run a lot of DX11 and PhysX applications, it is better for me to have the dGPU enabled and dedicated to PhysX. Having the dGPU on but not set to PhysX doesn't seem very beneficial. If you run DX9 applications it is certainly in your best interest to have the dGPU disabled. It is mainly a personal preference thing.

    -----------------

    For the leaderboard, I achieved this score through some overclocking:

    P5762 3DMarks (Graphics 7469, Physics 3422, Combined 3415), again on the GTX 660 Ti + i7-2620M.
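    To put a number on the DX9 takeaway, here's a quick sketch comparing the 3DMark06 totals from the runs above (the scores are copied from the results; the helper function is just illustrative):

```python
# Percentage gain of the dGPU-off run over the dGPU-as-PhysX run,
# using the 3DMark06 totals posted above.

scores_3dmark06 = {
    "dGPU PhysX": 10476,
    "CPU PhysX": 10399,
    "dGPU off": 14052,
}

def gain_pct(baseline, other):
    return (other - baseline) / baseline * 100

delta = gain_pct(scores_3dmark06["dGPU PhysX"], scores_3dmark06["dGPU off"])
print(f"dGPU off vs dGPU PhysX: {delta:+.1f}%")
```

    Roughly a one-third gain in the DX9 test with the dGPU off, versus the small loss seen in the DX11-oriented 3DMark11 run, which is the whole trade-off in a nutshell.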

  10. Guys,

    For the past few months I've powered the 660 Ti for my eGPU with an old PSU that I bought back in '06 (and it was working fine).

    Yesterday, a power outage blew out that power supply. Since I have the CX430 as a backup, I thought I'd be fine.

    However, the CX430 only has 1 PCI-E power connector, and the 660 Ti requires two.

    I remember Nando telling me CX430 would be fine for the card, so any tips on how to get this to work?

    Thanks all.

    They make adapters that combine two 4-pin connectors into one PCI-E connector. You can use one of these to power your 660 Ti. You can find them pretty easily on eBay, Amazon, or Newegg.

    Just search for: 4-pin to PCI-E adapter
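    Before relying on such an adapter, a back-of-the-envelope 12 V budget check is worthwhile. The figures below are assumptions, not measured values (a CX430-class unit's 12 V rail is rated around 28 A, and a GTX 660 Ti is roughly a 150 W card):

```python
# Rough 12 V power budget for running a GTX 660 Ti off a CX430 with a
# Molex-to-PCI-E adapter. All ratings here are assumed, not measured.

RAIL_12V_AMPS = 28   # assumed CX430 combined 12 V rating
GPU_BOARD_W = 150    # assumed GTX 660 Ti board power
SLOT_W = 75          # a PCI-E slot can supply up to 75 W
SIX_PIN_W = 75       # each 6-pin PCI-E connector is rated for 75 W

def cable_draw_w(gpu_w, slot_w=SLOT_W):
    """Power the auxiliary cables must deliver beyond the slot."""
    return max(gpu_w - slot_w, 0)

aux_needed = cable_draw_w(GPU_BOARD_W)            # watts over the cables
rail_headroom = RAIL_12V_AMPS * 12 - GPU_BOARD_W  # 12 V watts left over
print(f"Cables must carry ~{aux_needed} W; rail headroom ~{rail_headroom} W")
```

    On a PE4L the card isn't in a normal slot, so in practice even more of the load goes through the PSU connectors; the point is only that a 430 W-class unit has plenty of 12 V headroom for this card.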

  11. When I use RealTemp it's more often HOT than LOG, but it stays around 95, which is more or less borderline. It will have to do for now.

    After flashing several BIOSes yesterday and playing around with my power settings, I found my Lenovo power manager at fault. It somehow didn't like my Windows power profile. Once I set everything to performance in the Lenovo manager, my multiplier problem disappeared.

    Now I'm ready to test Gen2. Fingers crossed.

    You should check your fan and heatsink to make sure they are nice and clean. Try a cooling pad too. You normally shouldn't be seeing HOT that often (after you see HOT, it will stay at LOG). Or you can follow what I did to my laptop to try and cool the thing down:

    http://forum.techinferno.com/general-notebook-discussions/3437-thoughts-how-fix-insanely-hot-laptop-75c-idle-100-c-load.html

  12. Having the NVidia dGPU active gives less NVidia eGPU functionality and DX9 performance in an Optimus-capable configuration

    2. The NVidia control panel optional "High-performance NVidia processor" against an app, as shown below, will use the dGPU rather than eGPU for acceleration:

    Does setting my dGPU to be dedicated solely to PhysX solve this? I noticed when running FluidMark (PhysX testing):

    dGPU dedicated to PhysX - FluidMark sometimes struggles (this is a lot of particles moving at once), but I can have the effects on without issue (since this is passed onto the eGPU).

    dGPU not dedicated to PhysX - PhysX is processed on the eGPU (and so are the effects), noticeably smoother calculations (since the GTX 660 Ti is far better than the NVS 4200M)

    FluidMark indicates what GPU is doing what.

    I will try 3DMark06 and 3DMark Vantage and see what happens.

    --

    I have the dGPU demoted to 32-bit space while the iGPU and eGPU are in 36-bit space; I'm not sure if this helps with running stuff in DX9/10. The games I have been playing are DX11 anyway, so I'm not too worried about it.

    --

    Since the dGPU is dedicated to just PhysX, would the Opt link still be working between the iGPU and eGPU? I can see that it is a 1.2x link in GPU-Z; where would I check for Optimus?

  13. My computer runs hotter when I am using my CPU (probably due to the pci-e compression and w/e). You will get random CPU multiplier drops even though you don't quite hit the thermal limit (you may hit it for a second). Run RealTemp on another screen (if you have multiple) and see if the 'OK' changes to 'LOG' while gaming. If you don't have another screen, just have it running while gaming.

    I have no idea why you are so hesitant about establishing a Gen2 link; I had essentially all my issues resolved by moving from Gen1 to a Gen2 link. It helps immensely, so just do it.
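    If you want an actual log of throttling events rather than eyeballing RealTemp, a tiny watchdog along these lines works. Note that `read_temp` is a stand-in for whatever sensor readout you have available (RealTemp itself doesn't expose one), so this is purely a sketch:

```python
# Minimal throttle watchdog: poll a temperature source and record which
# samples crossed the threshold, mimicking RealTemp's OK -> LOG flag.
import time

THRESHOLD_C = 95  # roughly where this CPU starts dropping multipliers

def watch(read_temp, samples, threshold=THRESHOLD_C, interval=0.0):
    """Return the indices of samples at or above the threshold."""
    hot = []
    for i in range(samples):
        if read_temp() >= threshold:
            hot.append(i)
        if interval:
            time.sleep(interval)
    return hot

# Example with a fake sensor; swap in a real reading in practice.
fake = iter([80, 96, 90, 99])
print(watch(lambda: next(fake), 4))
```

    Run it on a second screen (or just in the background) during a gaming session and you get timestamps of exactly when the multiplier drops line up with heat.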
