sangemaru

Registered User
  • Posts: 157
  • Days Won: 1

sangemaru last won the day on September 27 2012.

About sangemaru
  • Birthday: 09/21/1987

sangemaru's Achievements: T|I Citizen (5/7)

Reputation: 51

  1. Thanks @Tech Inferno Fan. I'm kind of worried about getting a PE4C 3.0 or PE4L 2.1b since they're quite expensive to get in Europe. I'd definitely do it if I could find a buyer for my EXP GDC, though. A large part of the problem is the ExpressCard slot on this laptop; like I said, the adapter itself worked perfectly with all GPUs when using mPCIe. By the way nando, how did you get the ASPM settings to stick through sleeps/reboots? Currently I have to manually enable them in RW-Everything every time I sleep or reset the machine (it doesn't take very long, but it's still a bother). Is there a way to set these before Windows loads, without breaking the bootloader or the current installation, and have them stick? (One possible automation sketch is below.) EDIT: My god, prices of Bplus adapters have gone up massively, to the point where I doubt there's any sense in going for ExpressCard as opposed to a Thunderbolt solution. And nobody seems to be selling their Bplus adapters.
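     For what it's worth, a sketch of one way to automate the re-apply on wake (untested; Rw.exe does document a /Command= switch for running saved .rw scripts, but the paths, the task name and the contents of aspm.rw below are placeholders you'd fill in from your own RW-Everything session):

         rem Re-run the saved RW-Everything script on every resume from sleep
         rem (the System log gets Power-Troubleshooter Event ID 1 on wake)
         schtasks /Create /TN "ReapplyASPM" ^
           /TR "\"C:\Tools\RW\Rw.exe\" /Min /Nologo /Command=C:\Tools\RW\aspm.rw" ^
           /SC ONEVENT /EC System ^
           /MO "*[System[Provider[@Name='Microsoft-Windows-Power-Troubleshooter'] and EventID=1]]" ^
           /RL HIGHEST /RU SYSTEM

     A second copy of the task with /SC ONSTART would cover reboots. This still sets the bits after Windows loads rather than before, but it should at least make them stick without manual fiddling.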
  2. Two updates: using the EXP GDC, I tried out an RX 460 and a GTX 1060. Unlike previous models I'd tried, the RX 460 is capable of sustaining x1 2.0 and working just fine indefinitely. The GTX 1060 can sustain x1 2.0 but occasionally crashes, making it unsuitable for use with the EXP GDC. It seems the choice of video card matters a lot. One other glorious update: I used RW-Everything to enable ASPM across all the PCIe devices on all my root ports. Windows 10 now reports around 8 hours of available battery life (as opposed to 4-5) with Opera (20 tabs open) + Chrome (40 tabs open), Wi-Fi enabled, brightness set to min+3%, CPU capped at 99% max, battery saver on. Minimum CPU package power consumption has dropped to 3.4 W from 5.7 W. BatteryBar still reports only 3:40 of available time though; I'll be testing a discharge cycle to see how long the machine can really run. If I've just managed to increase my effective battery life from 3-4 hours to 7+, there's no way in hell I'm giving this qt3.14 laptop up. Just need to find the right stable eGPU now. I wonder if an RX 470/480 could sustain x1 2.0 stably. Oh, and this is while I'm writing/browsing; idle is 10-11+ hours on the 9-cell battery (with 10% wear).
  3. Yeah man, that's OK; 0x1 and 0x01 are the same thing as 8-bit values, apparently. 0x170 at 0x1f would glitch out for me, but that was with a hotter CPU; maybe I'll try again. Set 0x171 to something like 0x25 and, if it's stable, go down in increments of 5 (see the stepping sketch below). Ignore the errors, those pop up for everybody. Have you tried the Dell feature enhancement pack? http://www.dell.com/support/home/us/en/19/Drivers/DriversDetails?driverId=MHVWP&driverId=MHVWP&CID=281125&LID=5567365&DGC=AF&DGSeg=ARB&ACD=25789221108616845 Or this: http://forum.osxlatitude.com/index.php?/topic/5907-manually-controlling-the-cooling-fan-of-a-dell-laptop-or-pc/
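     The stepping sequence I mean would look like this from the modified GRUB shell (the same setup_var used throughout this thread; 0x25 minus 5 is 0x20, and so on):

         setup_var 0x171 0x25    # +0.37 V; if stable, step down
         setup_var 0x171 0x20    # +0.32 V
         setup_var 0x171 0x1b    # +0.27 V
         # ...keep stepping down by 5 until it stops being stable, then back off one step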
  4. 0x16F set to 0x1 (the GT OverClocking Support flag from the guide). Make sure to keep in mind the ranges posted in badbadbad's guide:

     GT OverClocking Frequency (0x170): 0x00-0xFF (8-bit value from 0-255) [iGPU]. Resulting clock is the decimal value x 50 MHz (example: 34 x 50 MHz = 1700 MHz).
     GT OverClocking Voltage (0x171; the guide's second "0x170" is a typo): 0x00-0xFF (8-bit value from 0-255) [iGPU]. +0.01 V increment for every value from 0x00 to 0xFF (example: 0x05 = +0.05 V).

     So if you'd like to try a relatively safe clock, try setup_var 0x170 0x1c, which should set 1400 MHz. Setting it to 0x17f most likely registers it as 0x7f, which is 127 in decimal and way out of bounds (the arithmetic is spelled out below). As for voltage, you gave it +0.37 V, which should be enough to get you well over 1500 MHz. Check the table below (frequency, temperature and power figures are from the GPU-z log; the voltage increments are speculation):

     Freq value | Resulting freq | Voltage value | Voltage increment | Memory freq (dual channel) | Highest GPU temp | Highest GPU power | Furmark 720p benchmark
     unchanged  | 1100 MHz       | unchanged     | +0.00 V           | 1600 MHz                   | -                | 18.1 W           | 371
     unchanged  | 1250 MHz       | unchanged     | +0.00 V           | 2133 MHz                   | -                | 17.9 W           | 515
     0x1a       | 1300 MHz       | unchanged     | +0.00 V           | 2133 MHz                   | -                | 19.5 W           | 517
     0x1b       | 1350 MHz       | unchanged     | +0.00 V           | 2133 MHz                   | -                | 21.1 W           | 532
     0x1c       | 1400 MHz       | unchanged     | +0.00 V           | 2133 MHz                   | 81 C             | 22.2 W           | 549
     0x1d       | 1450 MHz       | 0x05          | +0.05 V           | 2133 MHz                   | 83 C             | 23.5 W           | 553
     0x1e       | 1500 MHz       | 0x15          | +0.21 V           | 2133 MHz                   | 84 C             | 27.5 W           | 563
     0x1f       | 1550 MHz       | 0x25          | +0.37 V           | 2133 MHz                   | 87 C             | 31.6 W           | 589
     0x20       | 1600 MHz       | 0x40          | +0.65 V           | 2133 MHz                   | 93 C             | 37.4 W           | 639
     0x21       | 1650 MHz       | 0x50          | +0.80 V           | 2133 MHz                   | 102 C            | -                | 717

     Anyway, use ThrottleStop's TPL option to enable Intel Power Balance and give 0 to the CPU and 31 to the GPU in order to test overclocks. It's going to get pretty hot pretty fast.
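     To make the out-of-bounds point concrete, here's the arithmetic (plain hex-to-decimal conversion, nothing machine-specific):

         0x1c  = 28   ->  28 x 50 MHz = 1400 MHz   (the suggested safe value)
         0x17f = 383, but the variable is 8-bit, so only the low byte survives:
         0x17f & 0xFF = 0x7f = 127   ->  127 x 50 MHz = 6350 MHz, far beyond anything the iGPU can run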
  5. I disagree. I used the iGPU-only version with the 3740QM, decently cooled with liquid metal and the best pastes. It quickly ramps up to 105C, then hard-throttles down to about 3.2 GHz and slows down massively in games. Switching to a 3612QM made games much smoother.
  6. I tried switching out the newer EXP GDC for the older EXP GDC using the mPCIe cable, and it seems the old Beast doesn't even throttle down to 1.1, while the newer one does. So the older one actually has better signal integrity - just not over ExpressCard. And I don't know whether it's the cable or the ExpressCard slot. @Dewos, didn't you have this exact same problem?
  7. So I finally have an update on my machine: I got rid of the 3740QM (it was running too hot) and replaced it with a 3612QM, and everything is much smoother overall. I ordered another EXP GDC Beast (v8.0 this time); the mPCIe version was $33, which convinced me to pull the trigger on it. Unfortunately it confirmed my hunch, and it's very disappointing: this new adapter can't hold a Gen2 signal either... on ExpressCard. On mPCIe it can. It occasionally throttles down to 1.1 for a microsecond and then ramps back up. Gonna game a bit on it, but... very disappointing. Considering the ExpressCard slot seems unstable, forking over the $100 for a PE4C v3.0 seems like a much harder pill to swallow.
  8. Yeah, the CPU is a trooper. It has no problem maxing out its potential with the unlocked multipliers, though iGPU overclocking gets rather unstable for me past 1400 MHz (I can even game at 1600, for example). The only ceiling I'm hitting is cooling. At some point I might do some hardware mods for that. We'll see.
  9. No, I had no trouble with Gen2 enabled as long as I wasn't putting any load on the eGPU. My problems appeared the moment I tried to connect anything to it. If I booted the machine with the adapter connected but no monitor attached, Windows would be perfectly stable and report Gen2 speeds.
  10. My external monitor is an old Fujitsu B24W ECO. Unfortunately, I have no other workable machine with ExpressCard on hand at present (I do have an E4300, but it needs repairs before I can make it work; maybe I can report on it sometime next week). I could test with an HD 4870 or HD 3870 eGPU, but again, next week at the earliest. It would really suck if the adapter could actually deliver Gen2 but the problem was with the machines.
  11. For me: I set TOLUD to 2.5 GB, I have the iGPU set to always enabled as well as forced primary, and I force the PCIe speed to Gen1. I suggest you try that even if the 2570p can hold a link; who knows. That's about it.
  12. Have you played around with EFI variables to lower TOLUD, enable/disable the iGPU/dGPU, or switch the primary display device?
  13. If you don't have access to an unlocked BIOS, you're unlikely to be able to change the PCIe adapter speed with anything other than Setup 1.30, as far as I'm aware. In my case, I'd get a BSOD at Gen2 speed if the adapter wasn't the main GPU. If it WAS the main GPU, I wouldn't get a BSOD, but I'd get blackscreens, freezes, or both (usually both).

     Especially since you're using nVidia Maxwell cards, your system must have enough space below TOLUD to allocate resources for all your components, more so since you also seem to have dGPUs in your machines. Refer to this thread for more information. Without enough contiguous address space available, you simply can't use the cards; that's all there is to it. Some machines (like my own) offer dynamic TOLUD allocation or the ability to set the TOLUD size manually. Most machines do not. In the near-certain case that you don't have enough space available, you MUST perform a DSDT override.

     The second question has to do with signal integrity, specifically the inability of the EXP GDC to reliably deliver Gen2 signal quality. My own machine would automatically set the PCIe speed to Gen2 when the adapter was not set as primary from boot, or on hot-plug, which would instantly crash and freeze, since my adapter can't reliably deliver a Gen2 signal. I expect this is a common issue on most machines. I use EFI overrides to force a Gen1 signal while keeping the iGPU enabled and the eGPU set as the primary display on boot. This lets me boot and use the card properly, keeps the bus fixed at Gen1, and then allows hot-plugging. It took me days to get it working, and I was very fortunate to have the machine I do and to receive the support of the techinferno community.

     First, confirm you have enough PCI address space available (check the first link; a sketch of the check is below). Once you've confirmed that, use GPU-z to try to determine what bus speed your eGPU is using when connected to an external display. This might be easier if you uninstall all drivers and allow just the basic Microsoft VGA adapter driver. If the link speed is reported as PCI Express x1 @ 2.0, your machine is forcing Gen2 and it's likely your adapter can't handle the signal. You'd either need to use Setup 1.30 to downgrade the bus speed, or (better) request a refund from the vendor due to the adapter's inability to sustain Gen2 as advertised, and buy the superior PE4C v3.0 adapter instead.
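     As a concrete starting point for the address-space check, if you can boot a Linux live USB: on Sandy/Ivy Bridge the host bridge exposes TOLUD in its PCI config space (offset 0xBC on bus 0, device 0, function 0, if I'm remembering the datasheet right; treat that as an assumption and verify against your chipset docs). A rough sketch:

         # read TOLUD from the host bridge (pciutils)
         setpci -s 00:00.0 BC.l        # prints e.g. afa00001 -> TOLUD around 2.75 GB

         # if the window between TOLUD and 4 GB can't fit the card's BARs,
         # do the DSDT override: dump the table (RW-Everything can save the
         # ACPI tables), add a large memory window to _SB.PCI0._CRS (the
         # exact edit is in the thread linked above), then rebuild it with
         # Intel's iasl compiler
         iasl -d dsdt.dat              # decompile the dumped table to dsdt.dsl
         iasl dsdt.dsl                 # recompile to dsdt.aml after editing _CRS

     On Windows, RW-Everything will show you the same TOLUD register if a live USB isn't an option.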
  14. Two questions:
     - Have you confirmed your TOLUD is low enough for these GPUs to fit in the PCI address space on your machines?
     - Have you attempted to limit the PCIe speed to Gen1?
     The BSOD especially makes me think your adapters can't sustain Gen2 speeds. This is what happened to me: I can only use my EXP GDC in Gen1 mode, which is why I requested a refund from Banggood. They granted the refund and let me keep the adapter. The performance hit in my case (R9 270X) is about 20-30%, but it's still good enough to max out The Witcher 3 at 1920x1200 on all ultra settings (except nVidia Hairworks and SSAO/HBAO) at 30 fps.