Everything posted by D2ultima

  1. - If safe mode works and normal mode does not, it heavily suggests that some driver or startup item that loads in normal mode (but not in safe mode) is causing the problem. I suggest going into safe mode and doing a "clean boot": disable everything except Microsoft services and your Wi-Fi, then see if you can get into Windows proper. All I know is that if safe mode works, normal mode should work. - Good, though I think it's your Windows install. - That's a pretty unoptimized program, then. However, you can probably just use Process Lasso and limit it to 4 threads without killing HT permanently (there's a rough affinity sketch after this list of posts). - Back up your stuff and install fresh normally. - Good. Don't let companies do this kind of crap. It's one thing if they void your warranty because of the Prema mod, but breaking your PC and then leaving it alone is shit. - I doubt it.
  2. No, I was making a general statement. If you're replacing 1.5mm-thick pads, you should use 1.5mm of total thickness in their place.
  3. You should be able to stack a 1mm and a 0.5mm pad together, but a single 1.5mm pad would likely be best.
  4. Yes, because you didn't read the rules.
  5. Yes, you need the latest Prema BIOS or you won't be able to use the 980M. - Try uninstalling all GPU drivers and then booting into Windows normally. It should work just like in safe mode; if it doesn't, Windows is probably your problem. - Try using a second monitor as well. - What in the world do you do where hyperthreading is COUNTER-PRODUCTIVE? I can't imagine any such situation. - Have you attempted reinstalling Windows? - You should take that company to court for causing damage to your system and refusing to fix it. Whether or not they axed your warranty doesn't matter; they broke your system and then left it that way. - Your 780M might have problems. The fact that you didn't notice you couldn't use your dGPU means it might have been dead for ages without you realizing. - There is no .inf issue, because 780Ms are what shipped with the machine. You aren't using an upgraded GPU, so it doesn't matter.
  6. I've never touched a Pascal card or an unlocked Skylake, but I want to help. I wonder if I can do eet xD
  7. Oh, welcome to Optimus, where enthusiasts retch and average Joes claim "no problems here!" every day. Try manually adding the .exe to the NVIDIA Control Panel (NCP) and forcing it individually to use your dGPU; that should work (there's also a registry-based alternative sketched after this list of posts). I find it funny: I had to do a similar thing for GPU-Z with SLI. Those fools want the PCIe render test to be fullscreen so SLI/CrossFireX can work, except that SLI works in windowed mode, and fullscreening it didn't automagically make SLI work, so I had to add GPU-Z to NCP and force SLI on it... but then the test uses SLI in windowed mode >_<. Really, you'd think people would get their act together when they're such prominent program vendors or websites in the tech industry...
  8. Ask @iunlock; he's taken a decent few photos and doesn't seem to be the type who would feel like it's a bother to do so.
  9. From what I heard, said rumor came from a Blender run that was using GPU acceleration from the APU. So far it looks like AMD is playing the same game nVidia and Intel play, and not well enough. I don't like that. If AMD is going to be the same kind of garbage company, then that's even more reason to simply buy the stronger parts.
  10. Not really. It's the same deal, more or less. Didn't the other user say he set it to 1.235V or something and it sits around 1.267V to 1.295V depending on what's being done? That's how the desktops act too. The Skylake chip does what it wants; you can kind of just influence it a little bit. If you overvolt a lot, it'll probably stick, but since the point here is to use as little voltage as possible, the lower you try to set it, the more variance shows up.
  11. I was including desktops in my statement of "I've never seen a Skylake chip lock voltage".
  12. I've never seen someone manage to lock voltage on a Skylake chip. Manual/static voltage doesn't work; the chips do what they want, though manual voltage and offsets do have some kind of effect.
  13. Beam your knowledge of these things into my brain please =D
  14. This... is exactly my point. IPS panels offer better contrast, deeper black levels, etc. The colours look richer to the end user, but the actual range of colours is generally the same (though note the ranges aren't identical: 72% NTSC on one panel does not cover the same colours as 72% NTSC on another, even though both cover 72% of the NTSC gamut). It's why I've said it the entire time: 25ms response time, 60Hz (overclockable or not), 72% NTSC, 6-bit colour panels becoming the only panel option for high-end notebooks is an absolute fucking joke (a quick bit-depth calculation is sketched after this list of posts). I mean, the 120Hz 5ms 94% NTSC panel that is the new standard for the Clevo 17" models (soon, anyway; only the P870DMx for now) is a really big step up, especially since it's IPS rather than TN yet with better specs than previous TN panels like mine (72% NTSC, 120Hz, 5ms, 6-bit, TN)... but I'm still waiting on a proper review for it (as MSI's "5ms 120Hz" turned out to be 25ms GTG). If the panel's specs turn out to be true, then great. Otherwise... the notebook IPS panels are all a joke. Well, the lid for the P870DM and P870DM2 is the same, but houses all 3 panels. So anything housing one of the panels should house all 3, is my judgement. But yes, we'll see. Clevo Extreme Gaming has stated that the P670RP6/P670RS should get the 120Hz as an official configuration around December or early next year as well, which lines up with the lid housing the LP173WF4 panels (and thus being compatible with the 120Hz). I don't think it's consistency. I think it's just better contrast and deeper blacks, which make the colours look better. See the colour range explanation above. As for the panels being 8-bit or not, have you seen any new laptop panels that are 8-bit colour? Again, I only know of the AUO 4K panel. The Gsync panels from the Maxwell gen, except that 4K AUO, are all 6-bit.
  15. Where do you see 8-bit IPS laptop panels? The only one I've seen selling in the mainstream is that AUO 4K panel (the Gsync one). But see, this is the whole point about it all. LP173WF4 (1080p 60Hz IPS Gsync-certified 17" panel) = x mounting orientation, x dimensions, x cord length, etc. The AUO 4K panel whose name I don't remember = fits in LP173WF4 LCD covers and laptops, plug & play = x mounting orientation, x dimensions, x cord length, etc. (or close enough to it). AUO B173HAN01.2 (1080p 120Hz AHVA Gsync-certified 17" panel) = fits in AUO 4K panel covers and laptops, plug & play = x mounting orientation, x dimensions, x cord length, etc. (or close enough to it). Therefore, if I skip the 4K panel... like dropping a known variable in a mathematical equation: LP173WF4 size/mounting orientation/cord length/etc. == B173HAN01.2 size/mounting orientation/cord length/etc. If they were truly different, that would be beyond surprising to me. It would mean that one of the existing panels was modified to fit in one of the existing laptops, possibly by its cord length, which I highly doubt.
  16. Don't know the specs on the 1440p screen, can't say. Different manufacturers. Tech is the same, basically.
  17. I see. Interesting. I'm still sure it should work, though... the LP173WF4-SPx1 is the 1080p 60Hz Gsync panel for 17", and those covers hold the same AUO 4K 100% Adobe RGB panel that the P870DM held, which the P870DM3 holds, which also holds the AUO B173HAN01.2. So I basically reason it like this: the 1080p 60Hz lid can hold the 4K panel; the 4K lid can hold the 1080p 120Hz panel; therefore the 1080p 60Hz lid should hold the 1080p 120Hz panel. I wish I had someone willing to actually test it. But great news that the P775DM2/3 is getting it as an official config =D
  18. Hey @Prema do you think you could get the 120Hz AUO panel into a P670Rx or P775DM3 with Gsync enabled using one of your BIOS mods? Assuming a gsync GPU/license is purchased, of course. I sincerely believe that it'd benefit those machines and I'm wondering why it wasn't Clevo's new standard, honestly.
  19. Hmm... maybe it does. Maybe the lack of voltage only happens under LN2 temps. But okay. That's too bad though. Question, does it happen with memory at stock as well?
  20. Mmm... you needing more voltage goes against what I know about Maxwell, though. Think you can try getting the card down to the mid-40s (°C)? Kill the CPU overclock (or lower it just for the testing), turn off the machine, remove the back cover, blast a 15°C A/C vent into the heatsink area for about 15 minutes until it's COLD, boot it, instantly apply the OC and run the benchmark with maximum fans. See if you manage to get the card rolling with the lower voltage.
  21. What is this thing you call "sleep"? Jokes aside though, I am indeed currently unemployed, and thus I do have quite a bit of time. Edit: Truthfully, the SINGLE reason I do not overclock or bench much is because my room is simply too hot and humid and has extremely little airflow. It's on average 90°F (about 32°C) per day, and if it drops to 77°F (25°C) by midnight, I consider that very cool. Because there's so little airflow, when the area around me gets hot, it generally stays that way for a long time. Mythlogic could tell in an instant that my notebook had been in an extremely hot and humid environment the first time they got it back, just six months after I'd had it. Anything beyond the 1006/6000 above is immediately useless for pretty much anything that isn't Firestrike, just due to temperatures, unless it's a particularly cool night and I run benches at around 4am when it's about at its coldest. If I had access to a cooler room during the day, preferably one with an A/C, I would pretty much keep daily-driver overclocks in general. I've used some A/C rooms before, even large ones at university, and I was able to sustain LONG periods of overclocking even in Crysis 3 without thermal throttling or crashing.
  22. Why game less? Why not overclock and bench more AND continue gaming just as much? =D Besides, gaming raises a lot of even weirder questions too. Also, does anybody have any idea why the combined test in Firestrike only scales about 50% with SLI? At least for me: single GPU was generally 11-14fps and multi-GPU was 17-20fps, which is basically a 50% boost, yet the graphics tests showed over 95% scaling (the numbers are worked out after this list of posts).
  23. Eh, but the minute I notice something weird, I INSTANTLY want to find out why. How do you think I've learned so much already? xD
  24. Yes it is, but while the CPU does the calculations, it still has to feed each GPU for display. I.e. each GPU ought to have a VERY slight load during the physics test, and just feeding the extra GPU should theoretically shave an ever-so-slight amount of compute power off the physics result. But this doesn't happen on Maxwell/Pascal, and it makes no sense to me why.
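
Regarding the Process Lasso suggestion in post 1: Process Lasso does this through its GUI, but here's a minimal sketch of the same idea (pinning a process to four logical CPUs) using Python's psutil library. The process name below is just a placeholder, not anything from the original thread.

```python
# Minimal affinity sketch, assuming Python with psutil installed.
# Pins a running process to the first four logical CPUs, roughly what a
# Process Lasso per-process affinity rule does. "game.exe" is a placeholder.
import psutil

TARGET = "game.exe"      # hypothetical process name
CORES = [0, 1, 2, 3]     # first four logical processors; on a typical HT Intel
                         # chip these map to two physical cores, so something
                         # like [0, 2, 4, 6] would give four physical cores

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == TARGET:
        proc.cpu_affinity(CORES)   # restrict scheduling to these cores
        print(f"Pinned PID {proc.pid} to cores {CORES}")
```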
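
On the Optimus point in post 7: besides the NVIDIA Control Panel route, newer Windows 10 builds (1803+) keep a per-executable GPU preference in the registry. This is a different mechanism from NCP, and the executable path below is just an example, so treat it as a sketch rather than a drop-in fix.

```python
# Hedged sketch: set Windows' per-app GPU preference via the
# HKCU\Software\Microsoft\DirectX\UserGpuPreferences key.
# "GpuPreference=2;" requests the high-performance GPU (the dGPU on Optimus).
# The executable path is a placeholder.
import winreg

EXE = r"C:\Program Files\SomeApp\app.exe"   # hypothetical path
KEY = r"Software\Microsoft\DirectX\UserGpuPreferences"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY) as key:
    winreg.SetValueEx(key, EXE, 0, winreg.REG_SZ, "GpuPreference=2;")
```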
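
For the 6-bit vs 8-bit complaint in post 14, the arithmetic is simple enough to show: bit depth sets how many steps each colour channel gets, independently of how wide the gamut (e.g. 72% NTSC) is.

```python
# Bit depth vs displayable colours; 6-bit panels typically fake the
# missing steps with FRC dithering rather than showing them natively.
for bits in (6, 8):
    levels = 2 ** bits        # steps per R/G/B channel
    colors = levels ** 3      # total combinations
    print(f"{bits}-bit: {levels} levels per channel, {colors:,} colours")

# Output:
# 6-bit: 64 levels per channel, 262,144 colours
# 8-bit: 256 levels per channel, 16,777,216 colours
```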
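
And working out the Firestrike combined-test numbers from post 22: scaling here just means how much the SLI frame rate exceeds the single-GPU frame rate, where perfect 2-way scaling would be +100%.

```python
# Scaling = (multi-GPU fps / single-GPU fps) - 1, using the ranges from the post.
single = (11, 14)   # single-GPU fps in the combined test
multi = (17, 20)    # SLI fps in the combined test

for s, m in zip(single, multi):
    print(f"{s} fps -> {m} fps: +{(m / s - 1) * 100:.0f}% scaling")

# Output:
# 11 fps -> 17 fps: +55% scaling
# 14 fps -> 20 fps: +43% scaling
```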