• NVIDIA GTX 1080 PCB Rundown


    Brian

    With NVIDIA having now officially launched its next-generation Pascal-based GTX 1070 and 1080 GPUs, a number of AIB partner boards are coming to market soon. We have a video today by YouTuber Actually Hardcore Overclocking that examines the different custom PCBs and breaks down the differences between the various boards and their power delivery designs. If you've read about the additional power phases available on AIB cards but lacking on the Founders Edition, and never quite knew what that meant, this video should help.


      User Feedback


      This guy mostly knows what he's talking about. He was right to ignore the number of PCI-E power connectors; those have zero impact on overclocking, and more of them mostly just makes the card more power efficient. He did properly identify phases, which many people don't, but he annoyingly used Nvidia's recent nomenclature for phases, which is misleading. For example, when he says 8+2 he means 8 core phases and 2 memory phases. That is only the recent meaning of the phrase. In the past, 8+2 meant you had 10 core phases, with 8 switching in one sequence plus another 2 switching in their own sequence, both summing to deliver current to the core. The ATI 4890, for example, was a 3+2 configuration for the core. Minor thing, but using Nvidia's misleading marketing terms annoys me.
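      To see why the nomenclature matters, here's a minimal sketch of how the assumed per-phase current changes depending on which reading of "8+2" you use. The total core current is an illustrative assumption, not a measurement of any specific card:

```python
# Rough sketch: per-phase current under the two readings of "8+2".
# The 200 A core current is an illustrative assumption.

def per_phase_current(total_core_amps, core_phases):
    """Average current each core phase must deliver."""
    return total_core_amps / core_phases

total_core_amps = 200.0  # assumed core current under a heavy overclock

# New (Nvidia marketing) reading: "8+2" = 8 core phases + 2 memory phases
new_reading = per_phase_current(total_core_amps, 8)

# Old reading: "8+2" = two interleaved core groups, 10 core phases total
old_reading = per_phase_current(total_core_amps, 10)

print(f"new reading: {new_reading:.1f} A/phase")  # 25.0 A/phase
print(f"old reading: {old_reading:.1f} A/phase")  # 20.0 A/phase
```

Two extra core phases is a 20% drop in per-phase current, which is exactly the kind of difference the ambiguous marketing term hides.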

       

      For the bigger things I think he got wrong: he ignored cap quality and vdroop. Many "overclocking" cards try to have big, impressive-looking phases with big electrolytic caps that look cool. The problem is that of the three main cap types, the big radial electrolytics are the lowest quality. Surface-mount electrolytics tend to have better ESR (equivalent series resistance, which matters for sudden load changes or a phase switch) than the round radial electrolytics, but they're more expensive and look less cool. Radials do tend to have higher capacitance than SMD parts, but on "overclocking" desktop cards the electrolytic capacitance is usually overkill by nearly an order of magnitude, with ESR being more of a limiting factor. Ceramic caps have even better ESR, but they're the smallest and most expensive. Many of these boards completely lacked ceramic caps for the core on the front of the PCB. Maybe they made up for it on the back, but it's obvious that many manufacturers are trying to make their boards look cool.
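      A quick back-of-envelope sketch of why ESR dominates during a load step: before the regulator loop can respond, the instantaneous dip is roughly ΔI × ESR. The ESR values and load step below are order-of-magnitude assumptions per cap type, not figures for any specific part:

```python
# Instantaneous voltage dip at a load step is dominated by ESR:
#   dV ≈ dI * ESR (before the VRM control loop reacts).
# All ESR values here are rough order-of-magnitude assumptions.

esr_ohms = {
    "radial electrolytic": 0.030,        # assumed ~30 mOhm
    "SMD electrolytic/polymer": 0.010,   # assumed ~10 mOhm
    "ceramic (MLCC)": 0.003,             # assumed ~3 mOhm
}

delta_i = 50.0  # assumed sudden load step, amps

for cap, esr in esr_ohms.items():
    dip_mv = delta_i * esr * 1000
    print(f"{cap}: ~{dip_mv:.0f} mV instantaneous dip")
```

In practice caps are banked in parallel, which divides the effective ESR, but the ranking between cap types stays the same: the cool-looking radials give the worst transient response per cap.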

       

      As for vdroop, many of these boards screwed over the memory. Every high-phase-count card except the Zotac threw the memory phases way off to the right of the card. That gives the current a long distance to travel, and even worse, the current needs to travel under the core phases, which will already be using many PCB layers. This also hurts core vdroop, because PCB layers the core would otherwise get are needed to route the memory power. Then add in that three memory chips also have data and address lines running between the core and the power phases, and you have one overworked PCB.
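      The distance penalty can be estimated from plain copper resistance: R = ρL/(w·t) for a rectangular plane segment. The route length, plane width, and memory current below are illustrative assumptions to show the scale of the effect, not measurements of any card in the video:

```python
# IR-drop sketch for routing memory power across the board.
# Dimensions and current are illustrative assumptions.

RHO_CU = 1.68e-8   # copper resistivity, ohm*m
T_1OZ = 35e-6      # 1 oz copper foil thickness, ~35 um

def plane_resistance(length_m, width_m, thickness_m=T_1OZ):
    """DC resistance of a rectangular copper plane segment: rho*L/(w*t)."""
    return RHO_CU * length_m / (width_m * thickness_m)

# Assumed: memory current routed ~10 cm across a 2 cm wide plane slice
r = plane_resistance(0.10, 0.02)
i_mem = 20.0  # assumed memory rail current, amps

print(f"route resistance ~{r * 1000:.2f} mOhm")
print(f"drop at {i_mem:.0f} A: ~{i_mem * r * 1000:.1f} mV")
```

Tens of millivolts lost just to trace length is significant on a low-voltage rail, and splitting layers between core and memory power makes both rails worse at once.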

       

      In short, the ASUS looked like crap to me: no top-side ceramic caps and no SMD electrolytics. The PowerColor card looked good for air and H2O with a good ceramic cap count, but seemed to lack the FETs to push the current for LN2. The EVGA should be good for LN2 if the backside makes up for the front side's lack of ceramics. Its electrolytic count is insane, with tons of SMD caps, but it looks like it will have vdroop problems. The MSI looks decent with a good chunk of ceramics and a high phase count, but lacks SMD electrolytics.

       

      I don't get why he gave the Zotac so much crap. This card looked to be the best to me. That memory phase location is great. The PCI-E slot can route current up the left side so it can get a strong 12V connection that avoids messing with the core power side of the card and memory data and addressing. The Zotac will have the best vdroop of any card. It has a good count on ceramics and a few SMD electrolytics. Yes the FETs COULD be poor, but they could also potentially be better than the high/low combo chips on most cards.

       

      Honestly, the 25A-high/50A-low chip that he said Nvidia is using sounds like crap. Having a lower high-side current limit than low-side makes no sense. Yes, the high side conducts current far less often than the low side, so you want a smaller, faster-switching FET, but if you exceed that 25A the FET can still blow up. If the low side conducts 50A when it is on, then the high side also conducts 50A when it is on, so this chip is effectively rated for 25A because of the high side. I can see why manufacturers would avoid this chip. I'm surprised I didn't see the TI 83750 high/low combo chip on any cards, as it has a 40A rating and is very commonly used on mobile cards.
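      The point about peak versus average current can be sketched with basic buck-converter math: both FETs carry the full inductor current while switched on, and the duty cycle only changes the RMS/average, never the peak. The input voltage, core voltage, and per-phase current below are illustrative assumptions:

```python
# Buck-converter conduction sketch (ripple ignored): both FETs see the
# full inductor current when on; duty cycle changes RMS, not peak.
import math

def buck_fet_currents(vin, vout, i_out):
    """Peak and RMS currents for the high- and low-side FETs."""
    d = vout / vin  # duty cycle: fraction of time the high side conducts
    return {
        "duty": d,
        "high_peak": i_out,                  # full current when on
        "low_peak": i_out,                   # full current when on
        "high_rms": i_out * math.sqrt(d),
        "low_rms": i_out * math.sqrt(1 - d),
    }

# Assumed 12 V input, ~1.0 V core rail, 50 A per-phase load
r = buck_fet_currents(12.0, 1.0, 50.0)
print(f"duty {r['duty']:.1%}, high-side RMS {r['high_rms']:.1f} A, "
      f"high-side peak {r['high_peak']:.0f} A")
```

At a ~8% duty cycle the high side's RMS current is small, which is presumably how the vendor justifies a 25A rating, but the peak through it is still the full 50A, which is the commenter's objection.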



