
GTX970 power consumption discussion


MVC


I have been unable to make a GTX970 draw more than 150 Watts. The cMP lets you monitor power draw directly. I doubt VERY MUCH that any amount of overclocking could get one to 225 Watts.

They sip power, are wonderful cards.


@MVC: In Furmark I could measure very high load peaks with an overclocked GTX 970 (combination of GPU, ATX PSU, AKiTiO [without its PSU], and a molex-powered riser):


http://forum.techinferno.com/implementation-guides-apple/7879-2013-13-macbook-pro-gtx970%4016gbps-tb2-akitio-thunder2-win8-1-osx10-10-%5Bdschijn.html

But that is, of course, a worst case scenario and not going to happen with gaming or GPU computing. Haven't measured the load with my current build and the DA-2 yet…



I'm sorry, you're right.

I give up.



Too bad, was preparing my arguments :D

I must admit the power consumption is a worst-case scenario and includes the AKiTiO and the efficiency loss of the PSU, but Nvidia's 145W TDP spec still isn't right, as there is no reference GTX 970.


I have several reference 970s. They look like the reference 980 but have "970" on them.

And I would say that there is no rational way to get them to use anywhere near the power you are stating.

Running Furmark far surpasses any real world use.

Running Furmark at 110% is silly and wasting people's time.

Why not just pour some water in the case and see how much power that uses?

What is the point of scaring people?



Those 'reference' GTX970 cards are somewhat rare. US buyers can get them at Best Buy (NVIDIA GeForce GTX 970 4GB GDDR5 PCI Express 3.0 Graphics Card Silver 9001G4012510000 - Best Buy) and Newegg (NVIDIA GeForce GTX 970 4GB Video Card (Require Min. 500W Power Supply) - Newegg.com). Buyers outside the US have a hard time getting them. Besides, they are wider, 2.5-slot-width cards whose main benefits are the blower-style cooler, which suits certain cases, and their look.

You'd need to perform a VBIOS mod to raise your reference GTX970's TDP to gain higher overclocking potential as well as hold boost clocks for longer. Details on how to do this are at How to Raise the Power Target Limit on GeForce GTX 970 and GTX 980 - Crypto Mining Blog .

Other non-reference GTX970s have their VBIOS-set TDP far greater than the 145W quoted by NVIDIA. E.g. for the gaming-oriented GTX970s: ASUS - max 196W, MSI - max 220W, Gigabyte - max 280W. REF: http://www.**************/t/1516121/gtx-970-comparison-strix-vs-msi-gaming-vs-gigabyte-g1 (that's overclock dot net)

Last I checked, the VBIOS of the GALAX GTX970 being used at http://forum.techinferno.com/implementation-guides-apple/10099-2012-15-macbook-pro-gtx970%4010gbps-tb1-akitio-thunder2-win7-osx10-10-3-%5Bbsohn%5D.html and http://forum.techinferno.com/implementation-guides-apple/9821-2014-15-macbook-pro-iris-gtx970%4016gbps-tb2-akitio-thunder2-win8-1-%5Bdschijn-2%5D.html was 200W + 12.5%. I would recommend not going past a 110% power target (220W) in OC software (MSI/EVGA), as you are then sitting at 220W. With approx 10W overhead for the TB circuitry you're at 230W. The Dell DA-2 is officially specced at 220W, though a taobao vendor and my own testing confirm it can nudge 240W. Any more and it triggers a self-preserving auto power down.
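That budget arithmetic can be sketched quickly (the 200W VBIOS base, +12.5% cap, ~10W Thunderbolt overhead, and ~240W DA-2 trip point are the figures quoted in this post; the helper function is mine):

```python
# Budget arithmetic from this post: GALAX GTX 970 VBIOS base 200W,
# power target adjustable up to +12.5%, ~10W Thunderbolt circuitry
# overhead, and a Dell DA-2 that trips at roughly 240W.

VBIOS_BASE_W = 200.0
TB_OVERHEAD_W = 10.0
DA2_TRIP_W = 240.0   # observed shutdown point, above the 220W label

def system_draw(power_target_pct):
    """Worst-case system draw at a given power target (100 = stock)."""
    return VBIOS_BASE_W * power_target_pct / 100.0 + TB_OVERHEAD_W

for pct in (100, 110, 112.5):
    draw = system_draw(pct)
    print(f"{pct:>5}% target -> {draw:.0f}W total, "
          f"{DA2_TRIP_W - draw:+.0f}W margin before the DA-2 trips")
```

At the full 112.5% target only about 5W of margin remains, which is why staying at or below 110% is the safer recommendation.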


@MVC: Yes, Furmark is just a stress test to simulate the "worst case scenario". It's quite commonly used to check whether the system, OC, or cooling is stable.

Also, there is not and never was a reference GTX 970. The one you are talking about is the result of a community project from, I think, Overclockers UK. They used the reference GTX 980 layout with the GTX 970 chip, manufactured by Manli in China.

My point was that no GTX 970 sticks to the TDP of 145W, as EVERY single GTX 970 is factory OCed. Tests from Tom's Hardware have shown that GTX 970s draw 170-190W in gaming and 190-210W during torture tests (average values).

I am using my cards even beyond those specs by overclocking and increasing the power limit. And that is very common too :)



Guess I'm imagining these cards?

Here is a test where they imagined they had them too:

http://www.guru3d.com/articles_pages/geforce_gtx_970_sli_review,4.html

Note that they gauged power draw at max load as being 164 Watts.

Which is of course quite a lot for an imaginary card, but oddly almost exactly what I got when I measured them a few weeks back:

http://forums.macrumors.com/showthread.php?t=1879743

I actually got 169 Watts, the 5 extra watts may also have been my imagination. :adoration:



Ok, that card is running within the official GTX 970 reference specifications, that is true. It has the right power target and clock speeds. Still, it is not the official reference card. It is an Overclockers UK project: a GTX 970 chip on the GTX 980 reference board, with the GTX 980 reference cooler and the reference specs. So it kind of is a reference 970, but a custom-made one.

Besides, almost everybody is buying a differently branded card, and those run beyond the specs. So arguing from the "reference GTX 970" doesn't prove much, because it isn't what everyone else is running. All the cards we are talking about here exceed the TDP of 145 or 165W.


Dude, it's a reference 970.

Available in US.

It came in an Nvidia box.

Reference 970. I have several, have more coming. I don't live in UK. They are available in US.

And the non-reference 970s I tried couldn't draw more power in a cMP; they're limited to PCIe 2.0 x16 lanes there.

In any given TB eGPU, they get a quarter of that bandwidth, at best.

I ran Furmark on one screen, and power monitor on other.

Never got close to 200 Watts. Keep in mind I am referring to OS X draw. I don't know that anyone else has a 970 running Furmark in OS X. Furmark and "power virus" show up together in Google searches for a reason.


Still, it is not the official reference card. It is an Overclockers UK project: a GTX 970 chip on the GTX 980 reference board, with the GTX 980 reference cooler and the reference specs. So it kind of is a reference 970, but a custom-made one.

(photo attached)


@eeevan: Dafuq?

Eat this: https://www.overclockers.co.uk/showproduct.php?prodid=GX-205-OK

We have been impressed with the wide cross section of GTX970’s and GTX980’s we have reviewed since Nvidia’s official launch. It was interesting to note on launch day that Nvidia didn’t sample a reference GTX970, instead relying on partners to supply custom cooled overclocked solutions. Nvidia’s distinctive reference cooler has a devoted, loyal user base … who were royally miffed that a GTX970 reference card was not available. Andrew ‘Gibbo’ Gibson in Overclockers UK was keen to resolve the problem and he worked hard to create an ‘improved’ reference design – a worldwide exclusive.

Source: http://www.kitguru.net/components/graphic-cards/zardon/ocuk-geforce-gtx-970-nvidia-970-cooler-edition-review/

Edit: Ok there seems to be a reference card, but rarely available… http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970

In the early days of the GTX 970, there was no reference card available. A lot of people tried to get their hands on one, but there was none!

Later, Overclockers UK created a reference GTX 970 by themselves. They took the GTX 970 chip and other parts and hooked them up on a GTX 980 reference board with the GTX 980 cooler.

Most board partners used either GTX 980 PCBs or old GTX 670 PCBs!

Still… my benchmarks are not useless. Imho it is the claim that all GTX 970s draw little power that is useless. All board partners run the chips over the official specifications and have increased the power limit, so the cards draw more power. I tested, as I mentioned before, the peak values in a worst-case scenario (Furmark).

Since all users here are using non-reference cards, who is closer to the truth?


I just measured mine with a wattage monitor directly at the outlet.

KFA2 GTX 970 + Dell DA-2

standby: 0.7W

idle: 25-40W

stock, Metro LL benchmark: 165W to 198W (fluctuating a lot)

OC, Metro LL benchmark: 170W to 211W



Excellent.

When you deduct the amount needed for the AKiTiO itself, you end up with what I found: the card is always drawing less than 200 Watts.
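That deduction can be sketched as a quick back-of-the-envelope. The 90% PSU efficiency and 10W enclosure overhead below are my assumptions for illustration (the thread doesn't quote exact figures for either); only the 211W wall peak comes from the measurements above:

```python
# Back-of-the-envelope: estimate card-only draw from a wall reading.
# The 90% PSU efficiency and 10W enclosure draw are assumptions for
# illustration; the thread doesn't quote exact figures for either.

def card_draw_estimate(wall_w, psu_efficiency=0.90, enclosure_w=10.0):
    """Card power = wall power minus PSU conversion loss, minus the
    AKiTiO board / Thunderbolt circuitry's own draw."""
    return wall_w * psu_efficiency - enclosure_w

peak_wall_w = 211.0   # OCed Metro LL peak measured above
print(f"~{card_draw_estimate(peak_wall_w):.0f}W at the card")
```

Under those assumptions the 211W wall peak works out to roughly 180W at the card, consistent with "less than 200 Watts".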

My perspective on this is different from you guys'. My business is all about cards that go in the cMP (cMP = Classic Mac Pro = "cheese grater").

There we have a limit of 2 @ 6-pin power connectors on the board. When I sell EFI-modded cards they have to live FOR YEARS on just these 2 connections, plus what they can draw from the slot. It is in fact CRUCIAL that they not draw too much, as that leads to crashing or burned traces on the logic board, which would quickly catch up with me. Basically I have found that MANY cards draw too much from either the slot or the cables, but the cMP can take short hits of 100 Watts from the slot or 120 or so from a cable, just not for extended times.

There are capacitors all over the cards, the logic board, and the power supply that are designed to absorb these peaks. They function like small rechargeable batteries, except they only last for a brief moment: they can give a quick boost, but then they need spare capacity to recharge again.

So a brief spike is no problem, by design. It's when a spike lasts longer than a second or two that you have problems.
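The capacitor behaviour described above can be sketched as a toy energy-balance model. All component values here (capacitance, supply limit, voltages) are illustrative assumptions, not measurements of a real cMP:

```python
# Toy model of the "capacitors as tiny rechargeable batteries" idea:
# a bulk capacitor bank rides through brief load spikes that exceed
# what the supply can deliver, but a sustained overload drains it.
# All component values here are illustrative assumptions.

def min_bus_voltage(load_profile_w, dt=1e-5, v_nom=12.0,
                    supply_limit_w=225.0, capacitance_f=0.05):
    """Lowest bus voltage seen while playing back a load profile.

    While the load exceeds the supply limit, the capacitor bank covers
    the shortfall and its voltage droops; droop past some cutoff is
    when a machine would brown out.
    """
    v = v_nom
    v_min = v
    for load_w in load_profile_w:
        surplus_w = supply_limit_w - load_w        # negative during a spike
        # energy balance on the cap: E = 0.5 * C * V^2, dE = P * dt
        energy = max(0.0, 0.5 * capacitance_f * v * v + surplus_w * dt)
        v = min((2.0 * energy / capacitance_f) ** 0.5, v_nom)
        v_min = min(v_min, v)
    return v_min

short_spike = [400.0] * 100     # 1 ms at 400 W: absorbed by the caps
long_spike = [400.0] * 50000    # 0.5 s at 400 W: drains them

print(min_bus_voltage(short_spike) > 11.0)  # rides through
print(min_bus_voltage(long_spike) > 11.0)   # browns out
```

The same load level passes or fails purely on duration, which matches the observation that only spikes lasting longer than a moment cause trouble.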

So every generation of cards I test quite rigorously to find out if they will run stably in a cMP. For all intents and purposes, a card that regularly draws 230-235 Watts can live in a cMP. Beyond 240-250 and it will shut the machine down. I thus use Furmark in OS X as my litmus test. If a card shuts the cMP down the second Furmark launches, I label it as "External Power Only" and don't include the special Mac power cables. I would sell MANY more classic Titans and Titan Blacks if I didn't insist on this. People write me frequently wanting to be told that they can run these cards on internal power; if I were irresponsible I would say "Yes, go for it!" Instead I tell them the truth.

The Maxwell cards are unique in that even the Titan-X can run Furmark in OS X, and finish. The Kepler Titans literally shut the machine down in under a second. The bar across the top gets drawn, the machine prepares to render full screen, and CLICK, off goes the Mac Pro.

Some time back, while we were still working on the EFI for the Maxwell cards, a guy came on MR and claimed that the 970s had such low power draw that he could run 2 of them on internal power alone, just by splitting each of the 6-pin cables into 2. At first I thought he was crazy and glossing over issues. Once we got the EFI done, I gave it a shot in the linked thread, and I was amazed. I have had probably 10 different 970s in my hands and have not found one that draws significantly more power than the others. But I haven't obsessed over this, since the 980 and Titan-X are fine and there is little reason to worry about the 970.

OS X is a limiting factor, as is the fact that the cMP is PCIe 2.0, so the Maxwell cards can't fully "stretch their legs" like they could in a Windows PCIe 3.0 machine. Max draw figures under PCIe 3.0 in Windows don't really apply to OS X usage, or even eGPU usage, in that the cards can never transmit more than PCIe 2.0 x4 lanes' worth of data.
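For reference, the bandwidth gap can be worked out from the standard per-lane PCIe rates (general spec figures, not numbers from this thread):

```python
# Rough usable bandwidth per link type.
# PCIe 2.0 runs 5 GT/s with 8b/10b encoding (~500 MB/s per lane);
# PCIe 3.0 runs 8 GT/s with 128b/130b encoding (~985 MB/s per lane).

def pcie_bw_gbs(lanes, gen):
    per_lane_mbs = {2: 500.0, 3: 984.6}[gen]
    return lanes * per_lane_mbs / 1000.0

print(f"cMP slot (PCIe 2.0 x16): {pcie_bw_gbs(16, 2):.1f} GB/s")
print(f"TB2 eGPU (PCIe 2.0 x4):  {pcie_bw_gbs(4, 2):.1f} GB/s")
print(f"PC slot  (PCIe 3.0 x16): {pcie_bw_gbs(16, 3):.1f} GB/s")
```

A TB2 eGPU gets roughly a quarter of the cMP slot's bandwidth and about an eighth of a Windows PCIe 3.0 x16 slot's, which is why max-draw figures from the latter don't transfer.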

And I will give you some interesting things I have noticed over the years. The GTX280 was a 6-pin plus 8-pin card. When Apple tried to develop GT200 cards, they concentrated on the GTX260 for this reason. I have a few of the Apple development GTX260s, complete with mDP port. You can still find references to "05e2" in the OS today, in Yosemite, if you know where to look.

But they skipped this first iteration and ended up going with the refined GTX285, which went back to using 2 @ 6 pins. So Apple was able to use the latest and greatest GPU instead of a 2nd-tier one. Where it gets interesting is when you monitor the power draw of a GTX285 under load. The 6-pin that was an 8-pin on the 280 will always draw 20 or more watts more than the one that was always a 6-pin. And despite being technically a "less than 225 Watt" card, it can draw more: it will frequently pull 90-100 Watts from the 6-pin plug that was an 8-pin on the 280.

As much as we wish this stuff was cast in stone, it isn't. Some cards get an 8-pin plug despite being nearly incapable of drawing anywhere near that much current. To some extent it is like a 4-cylinder car with dual exhausts under its bumper: it LOOKS like a powerful car. Sometimes those 8 pins are so unnecessary that the card maker doesn't even wire a sense pin for the 8-pin. Many GTX680 and GTX770 cards have such a setup.

Is it possible to make a card draw more current via overclocking? Sure, but that isn't my thing, at all. I design ROMs and cards to work in a known environment. Every card I sell can potentially come back to bite me in the butt. So it is very important to me to keep an eye on power draw BEFORE I send them out.

Testing 970s, I was UNABLE to make them trigger the cMP circuit breaker, even when using a 6-pin split across the 2 plugs and running Furmark at full res for an extended time. During this run I never observed the card drawing more than 180 Watts; typically it stayed around 150-165.

With my recent testing of AMD cards I have come to realize that for eGPU purposes, the amount drawn from the PCIe slot itself can GREATLY affect eGPU stability. This is why Maxwell has been a stable choice: they rarely draw more than 50 Watts from the slot, while AMD cards go well beyond that. The cMP is nice because you can monitor power draw from all 3 sources as separate feeds. I'm pretty certain now that goalque's finding of instability on R9 cards is due to PCIe slot draw, but that is for another thread.

I have only found one other regularly available 970 that is built on a reference board; it is from MSI. The clocks are 50MHz over reference.

I have preferred the cards with 3 DP or 3 mDP ports because these offer the best future options for displays. I have a Dell 5K, and only cards with 2 or more DP ports can drive it, so I have avoided the 970 and 980 cards with just 1 DP port.



It’s in fact the opposite: the R9 280X and HD 7970 draw much less from the slot:

http://forum.techinferno.com/enclosures-adapters/7205-us%24189-akitio-thunder2-pcie-box-16gbps-tb2-97.html#post122115

Tom's Hardware tested the MSI R9 290X Lightning with an oscilloscope and says:

“We like that neither AMD nor Nvidia max out the PCIe slot connector's output rating, which is 75 W. Those auxiliary power cables bear the brunt of the load. Nor are there drastic load transients on the motherboard connector. All of this helps ensure system stability, benefiting multi-GPU setups in particular.”

Power Draw: Test System And Methods - MSI R9 290X Lightning Review: The Right Way To Cool Hawaii

You can see from the graph that it’s ONLY 32W from the PEG mainboard slot (12 Volts). Pretty close to my values once we deduct the TB card draw (5-10W).

The R9 270X's consumption was more interesting, but it behaved the same as the R9 280X: crash. Its values were nowhere near what the GTX 780 gave, however.

I repeated the tests 3 times to be sure; the column values were rounded to the nearest integer.

Remember that neither my tests nor anyone else's can be generalised; there is a lot of variation depending on the vendor. The Asus GTX 960 Strix is a good candidate if you like frying your board. See those spikes, ouch:

Problems at the Motherboard Slot:

“Asus GTX 960 Strix leaves the motherboard connector to deal with unprecedented unfiltered power spikes all on its own:”


Power Consumption Details - Nvidia GeForce GTX 960: Maxwell In The Middle


Maxwell indeed is amazing regarding power consumption!

So would you say that a highly clocked GTX 750Ti without any additional PCIe power cable will probably not work if it frequently requires more than 75W in peaks (even with a proper PSU)?



There are 750Ti cards which exceed the 75W slot limit, just as the above-mentioned Asus GTX 960 Strix does.



After additional testing I don't know what to think. Overall I have come to the conclusion that AMD Tahiti and Hawaii have issues when used as an eGPU in the AKiTiO, and I can't recommend them in any situation. It is something about power management: the cards aren't in the power state the situation calls for, in OS X and Windows alike. This may or may not be fixable.

The way to find the answer would be to find people using these as eGPUs who don't have these issues and work out what is different. If a different TB chassis is immune, then find out how it differs from the AKiTiO.

Keep in mind that power management in OS X is defined per machine, and there is no reason for it to specify AMD Tahiti power management on a 2014 Mini.

I can do some 750Ti tests, I have a couple different ones here.


I think it will be very hard to find people using AMD cards, since most of the guides are based on Nvidia cards. When people read about problems with AMD cards, they will probably play it safe and get an Nvidia.

A 750Ti test would be great.


I may be late to this discussion, as it has moved on to the GTX 750Ti, but I want to post a link to provide more information on GTX 970 power consumption: Nvidia GeForce GTX Watchdogs Results

Also check pages 11 through 13 of that article.

The link was taken from this post http://forum.techinferno.com/diy-e-gpu-projects/8304-gtx970-dell-da-2-test-results.html which contains useful information itself.


I've just experienced my first BSOD due to overdrawn power and instability.

I'm working on using some 16V capacitors to fix the instability.

I just need to find someone to approve my design, as I do not have an electrical engineering background.

