
rahl

Registered User
  • Posts: 4

Profile Information

  • Location: Europe


rahl's Achievements

Curious Beginner (1/7)

Reputation: 0

  1. If he's interested, do tell him... As for me, I'm very interested in your M6700/GTX880M tales. Just overclocked the living hell out of a K3000M in an M6700 the other day, and I already know this K3000M card is not gonna cut it. Therefore, I'll be looking for an upgrade along the lines of a K4100M, K5100M, M3000M, M4000M, M5000M, GTX870M, GTX880M, GTX970M or GTX980 in the near future. Might take a while till I find a good deal, as these cards tend to be expensive as fuck, and I don't want to blow through my wallet recklessly. If you have deep technical knowledge about your GTX880M upgrade in the M6700, I'd gladly listen to every bit of detail... if you're willing to share, that is.
  2. Those nVidia bastards are gonna pay for disabling my cores! Wait, I'm no Duke Nukem. Let's try that again... Thanks for the informative post, dude! Turns out the situation solved itself: at 1090 MHz core, the 3D application crashes almost instantly; at 1070 MHz, it takes five minutes to get there. So I'll be running 1050 MHz core max. This kind of goes hand in hand with some other TechInferno guy who (once upon a time) had a K3000M too -- back then, he bragged about getting it to 1088 MHz... and I kept his screenshot for reference. So these results make my card only slightly crappier -- but that's cool, too. 1050@1800 it is, so far. Read about some fools pushing their 680M GDDR5 to 2066 (but they probably didn't have this crappy Hynix memory). EDIT: 1050@2000 crashed, and 1050@1950 showed some minor artifacts in geometry. I'm gonna go conservative and leave it at 1000@1800... at least till I get my filthy human paws on some more recent card.
  3. You do remember right! And I'm a noob (hey, at least I'll admit that). As explained below, it turns out there's a deviously hidden setting in nVidia Inspector, "Activate full 3D by GPU usage", which I had set to 96% and then forgot all about (no wonder). So I pushed the card to 1050@1750, for fun... and it seems to hold. There's just one throttling drop in the attached screenshot, caused by setting the aforementioned item to 80% and coming across an undemanding area of the rendered scene (I'll set the value in question even lower, I guess). Now, with the supposed issue solved, what can we do here? Lemme rephrase the question to get some constructive discussion going: is running the card at such clocks considered safe? Or should I worry about blowing something (the 240W PSU, the internals of the notebook, my pride, etc.)? In other words, these Precision notebooks are supposed to handle 100W cards, and the stock K3000M is a 75W card. But then, what's the consumption like on this 60% GPU overclock (see the back-of-envelope estimate after this list)? Got no watt-meter here, oops. Thanks, guys
  4. So I managed to grab one of those fine BIOSes that are supposed to unlock the potential of these Quadro cards. Flashed it, and lived to tell the tale. This is a Quadro K3000M (Dell version, used in the Dell Precision M6700). The defaults of 650@1400 MHz got pushed to 850@1400 here, along with implementing that sweet OC unlock feature -- not bad, not bad at all. But then I wanted to squeeze more out of the card. I've tried 1000@1750 and 1050@1400, and it did work, but the card kept throttling down to a lower P-state every five seconds or so. The attached screenshot shows the GPU under some heavy load, at 1000@1400 clocks, throttling a bit. Obviously, jumping from 1000 MHz to 325 MHz every once in a while is not wanted, as such throttling comes with FPS drops. What can I do to do away with this throttling (see the monitoring sketch after this list for a way to read the throttle reasons directly)? nVidia Inspector? Modded drivers? Another GPU flash? Sorry guys -- I'm kind of lost here, not knowing what the next move is gonna be. Or am I pushing this little GPU a bit too far?
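
A practical way to pin down the throttling described in items 2-4 above is to log what the driver itself reports while the load runs. The following is a minimal sketch, not from the original posts: it reads NVML through the nvidia-ml-py Python package (pip install nvidia-ml-py) and assumes the driver exposes these queries on a Kepler card like the K3000M; the board-power readout in particular is often unsupported on mobile Quadros, which the code allows for.

    import time
    from pynvml import (
        nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
        nvmlDeviceGetClockInfo, nvmlDeviceGetPerformanceState,
        nvmlDeviceGetCurrentClocksThrottleReasons, nvmlDeviceGetPowerUsage,
        NVML_CLOCK_GRAPHICS, NVML_CLOCK_MEM, NVMLError,
    )

    nvmlInit()
    try:
        gpu = nvmlDeviceGetHandleByIndex(0)  # first NVIDIA GPU; adjust the index if needed
        while True:
            core = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)   # current core clock, MHz
            mem = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_MEM)         # current memory clock, MHz
            pstate = nvmlDeviceGetPerformanceState(gpu)               # 0 == P0, the full-3D state
            reasons = nvmlDeviceGetCurrentClocksThrottleReasons(gpu)  # bitmask of throttle causes
            try:
                power = "%.1f W" % (nvmlDeviceGetPowerUsage(gpu) / 1000.0)  # reported in mW
            except NVMLError:
                power = "n/a"  # assumption: many mobile Quadros simply don't report board power
            print("core %4d MHz  mem %4d MHz  P%d  throttle 0x%016x  %s"
                  % (core, mem, pstate, reasons, power))
            time.sleep(1)
    finally:
        nvmlShutdown()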
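
Run it in a second window while the 3D load is active. A clock drop that coincides with a non-zero throttle bitmask points at the driver (power cap or thermal slowdown), while a drop with a zero bitmask fits the "Activate full 3D by GPU usage" threshold from item 3 simply letting the card fall out of P0 on low utilization.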
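
As for the consumption question in item 3, a crude upper bound is possible without a watt-meter, assuming dynamic power scales roughly linearly with core clock at fixed voltage (these locked vBIOSes reportedly don't raise it) and, pessimistically, that the whole 75 W TDP scales with the core clock:

    P_est ≈ P_TDP × (f_OC / f_stock) = 75 W × (1050 / 650) ≈ 121 W

Since memory and board power don't actually scale with the core clock, and the stock card rarely sits at its full TDP anyway, real draw is likely well below that; still, the estimate suggests the 100 W slot budget, not the 240 W PSU, is the constraint to watch.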