
[Mod] Voltage increase Nvidia GT 555M



You mean even with the stock BIOS? Highly unlikely. While it is totally possible for the driver to override the vBIOS settings, I really doubt this is the case. More likely the value is simply being read out incorrectly.

But the only easy way to find out what's going on is to overclock the card and see whether the upper limit of the core clock is way beyond what it was with the previous driver.
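If you want to script that check instead of eyeballing it in an OC tool, here is a rough sketch using NVML through the pynvml Python bindings. This is only an illustration and not something anyone in this thread used; NVML support on an older mobile GeForce like the GT 555M may be limited, and the clock it prints is simply whatever the driver reports.

```python
# Sketch: log the driver-reported core clock and GPU temperature while the
# overclock is raised step by step in your usual tool, to see where the
# stable ceiling sits. Assumes the driver exposes NVML for this GPU (not
# guaranteed on old mobile GeForce parts) and that pynvml is installed.
import time

import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    print("Logging", pynvml.nvmlDeviceGetName(handle))
    for _ in range(60):  # one sample per second for a minute
        core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"core: {core_mhz} MHz, temp: {temp_c} C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Run a stress test alongside it and note the highest clock that stays stable without artifacts, once on the old driver and once on the new one.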

I know... I haven't tried with the stock vBIOS, but the upper limit increased: it was 800 MHz stable before, now it's 840 MHz. I wasn't able to get any higher before without artifacting or crashing. The voltage is reported in EVGA Precision as well as GPU Observer as 0.93 V even :)

Well, if you already used the modified BIOS then your voltage didn't increase; in the modified BIOS it's set to the maximum possible value, and that's hardwired, the driver can't override the limits of the hardware.

If you can clock higher than previously then it's due to driver improvements, which is entirely possible, though 40 MHz is a lot.

It would be interesting to see whether other people observe similar behavior with these drivers.


Nvidia has access to, and can change, any behavioral parameter, including the voltage tables, if they choose to in the design. Perhaps there's a hidden 0.9125 V table, or some voltage offset option, that the driver can access and set to, say, 0.93 V, specifically for CUDA programmers? Remember, laptop chips are undervolted, underclocked desktop chips; they all come out of the same barrel, get put into one of the two form factors, and are adjusted accordingly.


  • Founder

Which vBIOS and which OC program are you using? I'll give it a shot. I'm currently using a 0.92 V vBIOS for my 580s.


Remember, laptop chips are undervolted, underclocked desktop chips; they all come out of the same barrel, get put into one of the two form factors, and are adjusted accordingly.

Wrong. You're missing a big point here. Even though some laptop cards use the same silicon as desktop cards, the whole circuitry of the mobile cards is different; they don't even come close to their desktop equivalents, especially the high-end mobile cards. There's simply no room for such fancy circuit designs. Unlike some desktop cards, mobile cards don't come with programmable voltage regulators; the voltages are hardwired, and in most cases they can only be tweaked in a very limited range.

A chip doesn't just come "undervolted", it gets limited by the controlling and voltage supply circuits on the PCB.

Nvidia can't just do some magic tricks and increase the voltage of the card above its hardware limit. And Nvidia won't raise the voltage of a card with a new driver release either; in a lot of systems this would push the card above the thermal design limits of the system.

And if you read "0.93 V" on your 555M in the M14x, then it's simply an inaccuracy of the monitoring software. The voltage doesn't get measured; the monitoring software only displays a value it reads out from the driver or a different interface.
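To make the "read out, not measured" point concrete: the standard query interfaces just return whatever the driver reports, and on consumer GPUs of this era there is no core-voltage field among those queries, which is why monitoring tools go through vendor interfaces and simply display the number the driver hands back. A minimal sketch, assuming nvidia-smi is installed and on the PATH (some fields return "N/A" on older GPUs and drivers):

```python
# Sketch: ask the driver, via nvidia-smi, for the values it reports. Note the
# query list has no core-voltage field -- it isn't exposed here -- so any tool
# showing a voltage is echoing a number obtained from the driver through some
# other interface, not a measurement taken on the board.
import subprocess

fields = "name,clocks.gr,temperature.gpu,power.draw"
result = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
```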


OK, this has ceased to be productive... but once again, driver 302.59 raised my reported voltage to 0.93 V, for anyone who wants to test, confirm, etc. Could be interesting, guys :)


I'm no pro, so the above is most likely right that the 0.93 V reading is not correct. To find out, use something other than GPU-Z to read it; on my 580M at the stock Dell 0.87 V, GPU-Z sometimes shows 0.92 V, which I knew was completely false.


Hi, I haven't tried GPU-Z yet, just EVGA Precision and GPU Observer, along with a marked improvement in overclocking :)


Maybe, as SVL7 mentioned, we could have a few other users report whether they see the same improvement with these drivers. Either way the improvement has occurred, and 40 MHz is a nice jump. It would be nice to see more users report back.


Agreed, good sir; just looking to spread info and make people's gaming a little bit better.


  • 4 weeks later...

Main post updated, added two new versions of A08 (standard and unlocked) which bring back the SATA performance of BIOS A05. This allows GF116 users to enjoy the benefits of A05 as well, and brings a new option to the GF106 users who chose to stay on A05 due to the SATA behavior of A07 and higher.

Enjoy.


  • 4 weeks later...

Thanks for the continuous updates in spite of the fact that most people's efforts are going into improving the 600M series!

I also wanted to add that since the last few Nvidia driver updates came out, all programs are reading a voltage of 0.930 V, compared to the 0.912 V from when I first started using the A08 BIOS for the GF116.


Thank you for the feedback! :)

Regarding the voltage, I stick with my theory: the vBIOS didn't change, and thus the voltage table didn't change either. It seems that Nvidia changed something in its API which causes it to return a different value. I can't say whether 0.93 V or 0.912 V is more accurate...
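One low-effort way to narrow this down would be to log the reported voltage under both drivers with the same monitoring tool and compare the raw numbers. A hypothetical sketch follows; the file names and the "VDDC" column are placeholders for whatever your tool writes when sensor logging is enabled, not verified details.

```python
# Sketch: compare the voltage values from two sensor logs, e.g. one recorded on
# the old driver and one on 302.59. The file names and the column name "VDDC"
# are assumptions -- adjust them to match what your monitoring tool produces.
import csv

def reported_voltages(path, column="VDDC"):
    """Collect the numeric values of one sensor column from a CSV log."""
    with open(path, newline="") as f:
        return [float(row[column].split()[0])
                for row in csv.DictReader(f) if row.get(column)]

old = reported_voltages("sensors_old_driver.csv")
new = reported_voltages("sensors_302_59.csv")
print(f"old driver: {min(old):.3f} - {max(old):.3f} V")
print(f"new driver: {min(new):.3f} - {max(new):.3f} V")
```

Since the vBIOS is the same in both runs, a consistent shift in the reported number would point to a change in how the driver reports the value rather than in the card itself.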

Link to comment
Share on other sites

  • 2 weeks later...

Great work, I really appreciate you doing what others are afraid of. I'll give the unlocked A08 a flash and get back to you on what I end up with as far as OC'ing goes after I repaste my heatsinks.

Thanks again

Link to comment
Share on other sites

  • 2 weeks later...

OK! And I have another question: after your tests, which BIOS is best, the latest A08 or A05? I bought an old M14xR1 with the first GT 555M (3 GB) and no SSD, so I don't see why A08 would be good for me.
