Everything posted by octiceps

  1. Yeah, unfortunately GoW4 is a Windows 10 Store exclusive, unless they pull a Quantum Break after release and put out a Steam version later.
  2. Big if true. Should spread the word.
  3. GeCube? Now that's a name I haven't heard in a long, long time. I remember they used to be an AIB partner for ATi (yes, ATi) Radeon desktop cards; I thought they'd been defunct for years. Edit: That Taobao link still shows 2000 yuan (~300 USD) for the RX 480, at least for me. Compared to the 8GB desktop card, that might be the most reasonably priced MXM card I've ever seen, LOL. The MXM card has slower memory, though (7 GHz effective, like the desktop RX 480 4GB).
  4. How do you figure? RX 480 also uses GDDR5.
  5. Tutorial for the microcode hack is here: That thermal paste sounds like a Shin-Etsu product, going by the number. I've heard good things about Shin-Etsu in the past, but I'm using CLU now. Congrats on your upgrade.
  6. Glued on? Well, that's interesting. Not sure how effective the heat transfer would be compared to soldering the heat pipe. Yeah, it's possible to overclock the 4720HQ beyond +200 MHz, but it's a bit of a complicated process that involves editing the BIOS yourself to run a bugged older CPU microcode version that leaves the multipliers unlocked. There's a tutorial on this a few threads down. If you've got your TDP and current limits set properly in BIOS, you should be able to stay at 3.6 GHz on all four cores under CPU-only load as long as you're not overheating. Gaming is a different story: as you said, Turbo Boost is disabled under GPU load and the CPU drops down to its 2.6 GHz base clock. This can be improved slightly by undervolting the CPU, but the real fix has to come from the EC firmware. The 180W adapter is fine for stock operation and moderate GPU OC. It can supply more than 180W to the laptop; I've pulled 220W from the wall (>190W to the laptop after PSU efficiency) with it during overclocked benchmarking (quick sketch of that arithmetic below). Currently I'm using the 200W adapter from the P6xxRG, but the 230W Delta ADP-230EB T from the P6xxRS and ASUS ROG G750/G751 also works.
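     For reference, a minimal sketch of the wall-to-DC arithmetic behind those numbers, assuming a typical ~87% adapter efficiency (that figure is an illustrative assumption; real efficiency varies with load and temperature):

         # Back-of-the-envelope AC-to-DC conversion for a laptop power brick.
         # The 0.87 efficiency figure is illustrative, not a measured spec.
         def dc_power(wall_watts, efficiency=0.87):
             """Estimate DC power delivered to the laptop from wall draw."""
             return wall_watts * efficiency

         wall = 220.0  # measured at the wall during overclocked benchmarking
         print(f"{wall:.0f} W at the wall -> {dc_power(wall):.0f} W to the laptop")
         # 220 * 0.87 = ~191 W, i.e. more than the adapter's 180 W rating
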
  7. It's a small upgrade for single 900M systems. But for 900M SLI, it's debatable, since the 1060 can't use traditional driver SLI, only the DX12 MDA/LDA Explicit modes (which don't need the bridge) in games that actually implement them. So 1060 SLI would function exactly like a single 1060 in the benchmarks that are popular here, unless your idea of a benchmark happens to be Ashes of the Singularity.
  8. Yeah, I noticed the cutout on the copper for the capacitors. It also looks like an official part made at Foxconn. Thanks for the link. Looks cool (pun intended). Although I wonder how the shared heatpipe works. Did you simply weld it between the CPU and GPU heatsinks? I'd definitely be interested in this cooling upgrade for my P650SG if I can get my EC modded to remove the CPU throttling. Then I'll slap some CLU on it, use the microcode trick to unlock the multipliers, and hopefully OC to 4+ GHz.
  9. Very interesting. What was the Taobao link you purchased from? Would be useful for future reference.
  10. Hey @ghoul, have you noticed the overall system power limit? It seems the board is limited to a max of ~140W, pulling 155W-165W from the wall depending on PSU efficiency. Any more than that, and it will drain the battery. Try running TS Bench 1024M and Heaven/Valley at the same time with your CPU and GPU overclocked; you'll probably see it (a quick way to catch it in software is sketched below).
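     If you want to script the check: poll the battery while the combined load runs, and if it's discharging on AC you've hit the board's ceiling. This is just a sketch using psutil (pip install psutil); the limit behavior itself is as described above.

         # Watch for battery drain while plugged in during a combined
         # CPU+GPU stress test (e.g. TS Bench 1024M + Heaven/Valley).
         import time
         import psutil

         last_pct = None
         while True:
             batt = psutil.sensors_battery()
             if batt is None:
                 print("No battery detected")
                 break
             if last_pct is not None and batt.power_plugged and batt.percent < last_pct:
                 print(f"Draining on AC at {batt.percent:.1f}% -- "
                       f"load likely exceeds the board power limit")
             last_pct = batt.percent
             time.sleep(10)
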
  11. The MBP also has a lot less hardware under the hood, so pick your poison. I mean, there's a reason Clevo DTRs since 2014 have no ODD, right?
  12. True, that's what I assumed.
  13. The reviews on NotebookCheck and Gaming Laptops Junky mentioned it, on the P6xxRG/RE as well. Dunno about end-users, but it wouldn't surprise me if they didn't pay attention or accepted the convenient "all HQ processors throttle" explanation, which you know isn't necessarily true (the silicon itself is fine; the throttling is firmware-induced). Anyway, I was just curious, because not as many people had access to your mod during the P6xxRG/RE gen since that unfortunate incident, so IDK if you fixed it, and the problem is still present on the current P6xxRP6/P6xxRS. Edit: Pure speculation on my part, but maybe it's because Skylake uses less power and undervolts better than Haswell, so it was less noticeable? Although with a bad bin or an overclock it still throttles.
  14. Are you sure about that? Your P6xxSG/SE mod didn't fix it.
  15. I know, that's what I meant by "publicly available" as opposed to going through one of his partner shops. I bought my Clevo from Eurocom, which is a Prema partner.
  16. If he made one, he never released it publicly after the incident, AFAIK. If you have it, you can test by monitoring CPU frequency while both the CPU and GPU are loaded (rough sketch below).
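     Something like this works as a crude check (a sketch, assuming psutil is installed; note that psutil's frequency readout can be coarse on Windows, so a tool like HWiNFO gives a more precise readout):

         # Poll the CPU clock while a GPU load (Heaven/Valley etc.) runs
         # in the background. If it pins at base clock under combined
         # load, the firmware throttle is still there.
         import time
         import psutil

         BASE_MHZ = 2600  # e.g. 4720HQ base clock; adjust for your CPU

         while True:
             freq = psutil.cpu_freq()
             if freq is None:
                 print("cpu_freq not supported on this platform")
                 break
             flag = "  <-- stuck at base clock?" if freq.current <= BASE_MHZ else ""
             print(f"CPU: {freq.current:.0f} MHz{flag}")
             time.sleep(1)
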
  17. Check his other recent videos. How well it works depends on the game. Crysis 3? Great, but we knew this years ago. BF4? Not so great, but still not bad. GTA V? Awful. Anyway, I already gave my opinion when I downvoted that video, so there's that. P.S. I'm not disagreeing about whether 4-way works or not, just about its practicality and effectiveness. Nvidia is the king of artificial limitations, so it doesn't surprise me at all that end-users have found workarounds; this is what we've always done. Did you know that on a 4-way setup, you can force a game to run in n-way mode just by changing one bit in the SLI profile?
  18. The microstutter was apparent throughout the video, and the frame time spikes could be verified on the OSD (even if the numbers aren't fully accurate, since only FCAT can accurately measure FPS and ms-per-frame in mGPU), so clearly the game was running sub-optimally. Anyway, forget all the statistics that you benchmarkers go gaga over. Stop looking at the OSD and just watch the gameplay. It doesn't even pass the eyeball test for smoothness.
  19. 4-way "works", but it's a terrible experience just watching the video, and I'm not even the one playing the game. FPS and GPU usage drop all over the place, hitching, massive microstutter whenever the camera turns, frame time spikes into the hundreds of milliseconds, etc. The 99th percentile frame rate during that run was probably under 20 FPS (see the sketch below for how that figure is computed). This is actually perfect evidence of why SLI is for number-chasers, not for a smooth gaming experience. I'd love to see Digital Foundry get their hands on a 4-way Titan XP setup and FCAT it so they can tear that guy to shreds.
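     For anyone unfamiliar with the metric, here's how a 99th-percentile figure falls out of a frame-time log (the numbers below are made up for illustration; a real log would come from FCAT or a similar capture):

         # 99th percentile frame time -> the frame rate sustained 99% of the time.
         import numpy as np

         frame_times_ms = np.array([16.7, 18.2, 15.9, 120.0, 17.1,
                                    95.5, 16.4, 210.0, 17.8, 16.9])  # hypothetical

         p99_ms = np.percentile(frame_times_ms, 99)  # worst 1% of frames
         fps_99 = 1000.0 / p99_ms
         print(f"99th percentile frame time: {p99_ms:.1f} ms -> {fps_99:.1f} FPS")
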
  20. Whatever, it doesn't matter. I logged into my Google account just so I could downvote that video. Couldn't stand the way he prattled on despite the on-screen evidence showing the game running like shit.
  21. Err, the VRAM usage reporting is bugged. It shows >4TB in that video. It's been a bug in the Nvidia driver for the last several releases.
  22. There is smooth, and then there is that. BTW, that's not 8K. That's 4K with 4x DSR.
  23. In benchmarks, sure. But a lot of recent games like Doom and The Witcher 3 are quite PCIe-bandwidth-heavy (even with the HB bridge) at native 4K with temporal AA, so you need PCIe 3.0 x16 per card for positive/optimal scaling. That means only a HEDT platform with a 40-lane CPU will suffice, and since 40 lanes only cover x16/x16 for two cards, it rules out anything above 2-way SLI.
  24. Sorry to burst your bubble, Mr. Fox, but PhysX is dead as a doornail, since current-gen consoles and their successors are all AMD silicon. Your only hope for PhysX is GameWorks titles, and we all know what the general consensus on that is, despite the fact that AMD's equivalent, GPUOpen, is as bad if not worse (case in point: Deus Ex: Mankind Divided). Edit: And you certainly don't need a very powerful GPU as a dedicated PhysX PPU. Even a lowly 750 Ti is all you need to pair with a 1080: