svl7

NVIDIA Maxwell VBIOS mods - 900m series overclocking versions

Recommended Posts

As a guideline:

With the 970m mod you can expect about a 10.1k GPU score in 3DMark 11 when running the modified vbios at stock, and about 13.5k for the 980m.


Right, I'll add 3GB and 4GB versions when I get the chance, plus an Asus version. They should be up within the next couple of days.

As a guideline:

With the 970m mod you can expect about a 10.1k GPU score in 3DMark 11 when running the modified vbios at stock, and about 13.5k for the 980m.

Gotcha, thanks! Also, you meant 3DMark 13, right? :)


No, I mean 3DMark 11. There's no 3DMark 13 anyway. :P


Awesome! I can't wait to try this tonight. Should the NVIDIA drivers be removed, or is it okay to leave them installed and just disable the GPU in Device Manager when flashing?


You are the best! By the way, are you sure it's 3DMark 11? Because I already got the 1.3k graphics score from the 3DMark 11 test... and I haven't even flashed the vbios yet.

Maybe you meant Fire Strike?

No, I mean 3DMark 11. There's no 3DMark 13 anyway. :P

Hehe :) That's what some people call Fire Strike - 3DMark 13!

I thought you were talking about overclocked results for the GPU, because I saw Prema's 970M run and he hit a 10.1k GPU score in Fire Strike!


Just disable the GPU (or GPUs) in Device Manager and start the flashing process, then re-enable the GPU(s) and restart the machine.

You are the best! By the way, are you sure it's 3DMark 11? Because I already got the 1.3k graphics score from the 3DMark 11 test... and I haven't even flashed the vbios yet.

Maybe you meant Fire Strike?

That was just a guideline for people to see whether the card is performing as it should. The vbios is not overclocked by default, since every system and card can behave differently.

I've seen people post benchmarks with a 980m getting a 10k GPU score in 3DMark 11 and wonder whether everything's okay - well, in that case it's not. That's why I put those numbers there, for people who aren't that familiar with how the card should perform and don't know what they can expect.

With overclocking you can get a 16k+ GPU score in 3DMark 11 with the 980m and 14k+ with the 970m.
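As a rough illustration of those guideline numbers, here is a small sanity-check sketch. The thresholds are just the ballpark figures quoted in this thread, and the 5% tolerance is my own assumption, not anything official:

```python
# Ballpark 3DMark 11 GPU scores from this thread (modified vbios):
# "stock" = no manual overclock, "oc" = roughly what a good overclock reaches.
GUIDELINES = {
    "970m": {"stock": 10100, "oc": 14000},
    "980m": {"stock": 13500, "oc": 16000},
}

def looks_healthy(card, gpu_score, tolerance=0.05):
    """True if a stock run lands within ~5% of the guideline score or above."""
    target = GUIDELINES[card]["stock"]
    return gpu_score >= target * (1 - tolerance)
```

So a 980m scoring around 10k at stock would fail this check, which matches the point above: something is off with that system.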


I followed the procedure and did the following (in a command prompt with administrative privileges):

- Disabled the 970m in Device Manager

- Opened a command prompt with administrative privileges

- Went to the mnvflash folder and entered the following command:

ex: c:\mnvflash -6 bios.rom

and I get this error:

"ERROR: Firmware image filename must have a valid extension. (.rom .nvr .efr .ef)"

I've got the Clevo 970M vbios version 84.04.22.00.13, used with an MSI GT60-16F3.

====

Update:

Got it flashed. The file name was too long, so the solution is to use the short (8.3) name instead, e.g.: c:\mnvflash -6 Nvidia~1.rom

then follow the procedure.


Eh guys.. what's the safest setting for overclocking the core and mem clocks, and for adjusting the power limit, on the 970m? Thanks

Eh guys.. what's the safest setting for overclocking the core and mem clocks, and for adjusting the power limit, on the 970m? Thanks

For 24/7 use I'd recommend the highest overclock you can achieve with no voltage bump. For benching there are no limits! Just make sure your temps stay in a safe range - no more than 85°C.
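That "highest stable offset, no extra voltage" advice boils down to a simple upward search. A sketch of the idea; the step size, cap, and the `is_stable` callback (in practice, a benchmark or stress-test loop at that offset) are placeholders of mine:

```python
def find_daily_offset(is_stable, step=25, max_offset=400):
    """Walk the core-clock offset (MHz) upward until instability,
    then return the last offset that passed the stability check."""
    best = 0
    offset = step
    while offset <= max_offset:
        if not is_stable(offset):
            break
        best = offset
        offset += step
    return best
```

With a card that is stable up to +175 MHz, this walk would settle on 175 - matching the kind of result reported later in the thread.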


So far this is the stable bench in 3dmark firestrike

post-7316-14494999032584_thumb.jpg

core clock = 175

mem clock = 600

Do you think getting an extreme mobile processor would help increase the benchmark score? By how many points, do you think? Thanks


Thank you to both of you for all of your hard work and dedication to this community. Where is the tip jar?

What exactly does the power limit slider affect? I understand the voltage offsets, but I'm not sure exactly what the power limit does and how the GPU is affected by the limit I impose. I assume 100% is stock.

Thank you to both of you for all of your hard work and dedication to this community. Where is the tip jar?

What exactly does the power limit slider affect? I understand the voltage offsets, but I'm not sure exactly what the power limit does and how the GPU is affected by the limit I impose. I assume 100% is stock.

The power slider determines the maximum power the GPU can draw, in watts. If you overclock, you'll want to increase the power slider beyond 100%, otherwise the GPU will throttle its clocks when it hits the limit. If I had such a GPU and my temperatures were good, I'd just put the power slider at max to avoid any power-related throttling.
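In other words, the slider simply scales the card's stock power target. A toy illustration; the 100 W stock figure below is an assumed example, not a spec quoted in this thread:

```python
def power_budget(stock_limit_watts, slider_percent):
    """Maximum sustained power the GPU may draw at a given slider setting;
    the card throttles its clocks once it hits this budget."""
    return stock_limit_watts * slider_percent / 100.0
```

So for a hypothetical 100 W card, a 120% slider allows up to 120 W before power throttling kicks in.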


It seems that I've hit the limit for my 970m:

175 = core clock

600 = mem clock

I won't touch the power limit yet (default settings) unless you guys know the safe limits for moving the slider on this one.

