NVIDIA GTX 1080 PCB Run Down


    Brian

    With NVIDIA having now officially launched its next-generation Pascal-based GTX 1070 and 1080 GPUs, a number of AIB partner boards are coming to market soon. We have a video today by YouTuber Actually Hardcore Overclocking that examines the different custom PCBs and breaks down the differences between the various boards and their power delivery designs. If you've read that AIB cards will offer additional power phases that the Founders Edition lacks but never quite knew what that meant, this video should help.

     

     




      User Feedback


      This guy mostly knows what he's talking about. He was right to ignore the number of PCI-E power connectors. Those have zero impact on overclocking; more of them just makes the card a little more power efficient. He did properly identify phases, which many people don't, but he annoyingly used the recent Nvidia nomenclature for counting phases, which is misleading. For example, when saying 8+2 he means there are 8 core phases and 2 memory phases. That is only the recent meaning of the phrase. In the past, 8+2 meant you had 10 core phases, with 8 switching in one sequence plus another 2 switching in their own sequence, both summing to deliver current to the core. The ATI 4890, for example, was a 3+2 configuration for the core. Minor thing, but using Nvidia's misleading marketing terms annoys me.
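      To make the difference concrete, here is a minimal sketch of how the two readings of "8+2" change the per-phase load; the total current figure is assumed purely for illustration:

```python
# Illustrative only: the core current here is an assumed figure, not a measurement.
core_current_a = 200.0

# Recent NVIDIA-style reading: "8+2" = 8 core phases + 2 memory phases.
new_style_core_phases = 8
print(core_current_a / new_style_core_phases)   # 25.0 A per core phase

# Older reading: "8+2" = 10 core phases total, split into two groups that
# each switch in their own sequence but both feed the core.
old_style_core_phases = 8 + 2
print(core_current_a / old_style_core_phases)   # 20.0 A per core phase
```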

       

      For the bigger things I think he got wrong: he ignored cap quality and vdroop. Many "overclocking" cards try to have big, impressive-looking phases with big electrolytic caps that look cool. The problem is that of the three main cap types you can use, the big radial electrolytic caps are the lowest quality. Surface-mount electrolytics tend to have superior ESR (equivalent series resistance, which matters for sudden load changes or a phase switch) to the round radial electrolytics, but they are more expensive and look less cool. Radials do tend to have higher capacitance than SMD parts, but electrolytic capacitance is usually overkill by nearly an order of magnitude on "overclocking" desktop cards, with ESR being more of a limiting factor. Ceramic caps have even better ESR, but they are the smallest and most expensive. Many of these boards completely lacked ceramic caps for the core on the front of the PCB. Maybe they made up for it on the back, but it's obvious that many manufacturers are trying to make their boards look cool.
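      As a rough sketch of why ESR dominates during a load transient: the instantaneous droop across the output caps is just V = I × R. The ESR figures below are assumed ballpark values per cap family, not datasheet numbers:

```python
# Instantaneous droop across the output caps during a load step: V = I * R.
# ESR values are assumed ballpark figures per cap family, not datasheet numbers.
load_step_a = 50.0  # assumed sudden load change, in amps

esr_ohms = {
    "radial electrolytic": 0.030,
    "SMD electrolytic":    0.010,
    "ceramic (MLCC)":      0.002,
}

for family, esr in esr_ohms.items():
    print(f"{family}: ~{load_step_a * esr * 1e3:.0f} mV droop")
```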

       

      As for vdroop, many of these boards screwed over the memory. All the high-phase-count cards except the Zotac threw the memory phases way off to the right of the card. This gives the current a long distance to travel, and worse, the current needs to travel under the core phases, which will already be using many PCB layers. This also hurts core vdroop, because PCB layers the core would otherwise get are needed to route the memory power. Then add in that three memory chips also have data and address lines running between the core and the power phases, and you have one overworked PCB.
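      A back-of-the-envelope sketch of what that extra distance costs, using the common ~0.5 mΩ-per-square sheet resistance of 1 oz copper; the route geometry and current are assumptions:

```python
# IR drop over a long power route; geometry and current are assumed values.
sheet_res_ohm_sq = 0.0005   # ~0.5 mOhm/square for 1 oz copper
route_length_mm  = 80.0     # memory phases to memory chips (assumed)
route_width_mm   = 10.0     # effective copper width on the plane (assumed)
rail_current_a   = 20.0     # memory rail current (assumed)

squares = route_length_mm / route_width_mm
drop_mv = rail_current_a * sheet_res_ohm_sq * squares * 1e3
print(f"~{drop_mv:.0f} mV lost along the route")  # doubles if the distance doubles
```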

       

      In short, the ASUS looked like crap to me: no top-side ceramic caps and no SMD electrolytics. The PowerColor card looked good for air and H2O, with a good ceramic cap count, but seemed to lack the FETs to push the current for LN2. The EVGA should be good for LN2 if the backside makes up for the front side's lack of ceramics; its electrolytic count is insane, with tons of SMD caps, but it looks like it will have vdroop problems. The MSI looks decent, with a good chunk of ceramics and a high phase count, but it lacks SMD electrolytics.

       

      I don't get why he gave the Zotac so much crap. This card looked the best to me. That memory phase location is great: the PCI-E slot can route current up the left side, so the card gets a strong 12V connection that avoids messing with the core power side of the card and the memory data and address lines. The Zotac will have the best vdroop of any card here. It has a good count on ceramics and a few SMD electrolytics. Yes, the FETs COULD be poor, but they could also potentially be better than the high/low combo chips on most cards.

       

      Honestly, the 25A-high/50A-low chip he said Nvidia is using sounds like crap. Having a lower high-side current limit than low-side makes no sense. Yes, the high side conducts current far less often than the low side, so you want a smaller, faster-switching FET there, but if you exceed that 25A the FET can still blow up. If the low side conducts 50A when it is on, then the high side also conducts 50A when it is on, so this chip is effectively rated for only 25A because of the high side. I can see why manufacturers would avoid it. I'm surprised I didn't see the TI 83750 high/low combo chip on any cards, as it has a 40A rating and is very commonly used on mobile cards.
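      The duty-cycle math behind that complaint, as a small sketch with assumed input and output voltages typical of a core rail:

```python
# In a synchronous buck phase, both FETs carry the full inductor current
# while they are on; the duty cycle only sets how long each one conducts.
v_in, v_out = 12.0, 1.0          # assumed input and core voltages
duty = v_out / v_in              # high-side conduction fraction

phase_current_a = 50.0           # assumed inductor current while switched on

print(f"high side on {duty:.1%} of each cycle at {phase_current_a} A")     # ~8.3%
print(f"low side on {1 - duty:.1%} of each cycle at {phase_current_a} A")  # ~91.7%
# So a 25 A high-side limit caps the whole phase at 25 A, regardless of
# the 50 A low-side rating.
```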


    Similar Content

      • By ounces
        After spending significant time and effort to obtain a "DC" screen for the 8770w (which is essentially a regular IPS panel with a fancy board that converts 8bpc LVDS to 10bpc DP), I have finally got one and installed it. All works great, except for one problem...

        It has pretty bad banding/posterization in the lower shadows. I have tried profiling it in different modes (full range, sRGB, Rec. 709); the issue persists, and it indeed shows only in the lowest part of the characteristic curve. Mids and highlights are represented fine and show low deviation from reference values.

        The GPU is an HP K4000M; Nvidia drivers installed as-is, and the video card is identified without a hitch.
        Banding was not present with the original TN panel using the same GPU.
         
        While checking the software side, I noticed that Win10 has the bit depth set to 8-bit...
         

         
        My initial reaction was, "Easy, let's change it in `nvidia-settings` and we're all set":

        ...but that would be too easy, right? After selecting 10bpc and clicking "Apply", the screen went off and back on, only to show that the depth stayed at 8bpc. Repeating the above a few times yielded exactly the same result, and I'm not in a hurry to meet the clichéd (and layman's) definition of insanity.
         
        Let's check GPU-Z. So far so good, nothing unusual. Notice the highlighted BIOS version and subvendor string:
         
        Time to delve into the other tabs. We are running WDDM v2.4, which supports GPU dithering, but hey... the BIOS version has changed!
         
        Briefly back to `nvidia-settings` to check what is reported by the vendor's own utility:

         
        So far, we have two strings for the BIOS version:
        80.04.5A.00.02 (let's call it "A")
        80.4.33.0.37 (let's call it "B")
        Notice how the second one doesn't seem to follow hexadecimal notation. Lastly, the "NVIDIA BIOS" drop-down reports the "A" version:
         
        ...and the monitor section, which confirms that the rig is indeed capable of 10bpc but is currently running at a mere 8bpc:

         
        Windows "Adapter settings", reports version "B". It's 2019, diversity is a must.

         
        "NVidia inspector" is of the same opinion:

         
        Now, let's use some seriously legit tools and check the exported BIOS file in `nvflash`:

         
        Here we have three interesting findings:
        - The reported vendor is Dell, not HP. See this link for details.
        - The BIOS version is back to "A". Have I already mentioned diversity?
        - The MXM module uses MX25L2005 flash storage in a WSON-8 package. If things go really nasty, we should be able to rescue the patient via a Pomona clip and an external programmer.
        Loading the same file in "Kepler BIOS Tweaker" confirms these facts:

         
        EDID settings, courtesy of the NVIDIA Control Panel; a hex dump can be found at the bottom of this post.
        ...Should I be worried about the "60.02Hz" refresh rate?
         
        To summarize:
        - Why are two different BIOS versions reported? Is it anything to do with UEFI (e.g., HP sideloading its own during boot)?
        - Why are two different vendors reported? As far as I remember, this is a branded HP GPU.
        - Where can I get a "clean" K4000M BIOS for future experiments? Ideally from an 8770w equipped with a "DreamColor" panel from the factory.
        Link to the dumps, BIOS ROM and monitor EDID: https://mega.nz/#F!zGgRmQIL!9q2QFZtHuK2RQ-WHXMA4Mg (also attached to this post)
        K4000M.zip
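        Since the EDID dump is attached, a quick cross-check of what the panel itself declares is to read the Video Input Parameters byte. A minimal sketch follows; the `edid.bin` filename is hypothetical (point it at the dump from the archive above), and the byte offsets follow EDID 1.4:

```python
# Read the declared color bit depth from an EDID 1.4 binary dump.
# The filename is hypothetical; use the dump attached to this post.
with open("edid.bin", "rb") as f:
    edid = f.read()

video_input = edid[20]                      # Video Input Parameters byte
if video_input & 0x80:                      # bit 7 set: digital input
    depth_code = (video_input >> 4) & 0x07  # bits 6..4: color bit depth
    depths = {1: "6", 2: "8", 3: "10", 4: "12", 5: "14", 6: "16"}
    print(f"Panel declares {depths.get(depth_code, 'undefined')} bits per color")
else:
    print("Analog input; no bit depth field")
```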
      • By Blocker35
        Hi guys, bit of a rookie to the whole eGPU scene. Currently I have:
         
        - MacBook Pro 2015 13inch (3.1GHz Core i7 16GB RAM)
        - Razer Core X
        - Apple TB3 to TB2 Adapter
        -  TB2 Cable (Cable Matters)
        - 23inch AOC External Monitor
         
        I am wondering what graphics card to get to run X-Plane 11 at high graphics settings.
        I have purgewrangler set up and ready to use with an AMD graphics card, but I'm also open to the idea of an Nvidia graphics card.
        Any advice is appreciated. I did not buy a Windows PC, as I need a Mac for various other things and wanted an all-in-one laptop.
      • By handale30
        I added this post in another forum a few months ago but felt like it could also help someone here, enjoy... So I've owned quite a few Alienware laptops, but out of all of them my favorite has always been the great M17x R4. I love its aggressive looks; its lines and grills remind me of Lamborghini design, the one-sheet glass screen and media keys looked futuristic, and its laser-etched name plate added a touch of personalization and attention to detail that they used to have. Considering it still has powerful specs by modern standards (i7 processors, 32GB RAM, two SSD slots and one mSATA slot, 120Hz 3D display, MXM slot, SD card reader), I couldn't help but sell my two current laptops (15 R1 980M 4K, 15 R3 1060) and purchase a pristine example of this machine, with all its original packaging, to upgrade it to today's specs.

        The original specs of this particular system were:
        * Intel i7-3820QM 2.7GHz/3.7GHz
        * Nvidia GTX 680M 2GB
        * 8GB DDR3 RAM 1600MHz CL11
        * 250GB Samsung 840 Pro SSD
        * 1TB Seagate Momentus HDD
        * 17" 1080p 120Hz 3D display

        Desired upgrades:
        * Intel i7-3940XM 3.0GHz/3.9GHz
        * Nvidia GTX 1070 8GB
        * 32GB DDR3 RAM 1600 CL10 Corsair
        * 1TB Samsung 860 EVO SSD
        * Liquid metal and better thermal pads

        2019 specs:
        * Intel i7-3940XM 3.0GHz/3.9GHz
        * Zotac GTX 1070 MXM 8GB
        * 32GB DDR3 RAM 1866 CL9 Corsair Vengeance
        * 1TB Samsung 860 EVO SSD
        * 256GB Samsung 840 Pro SSD
        * Liquid metal and Fujipoly 17 W/mK thermal pads
        * 17.3" 1080p 120Hz 3D display

        I already purchased the GTX 1070 MXM, the 32GB of RAM, and the 1TB SSD. The GTX 1070 is installed and working at its full power. I applied Thermal Grizzly Conductonaut liquid metal thermal compound to both CPU and GPU, and Fujipoly Ultra Extreme 17.0 W/mK thermal pads, with my modified GPU heatsink and new X-bracket. The CPU is overclocked from the OEM BIOS to stage 1: 4.1GHz, 3.9GHz on 4 cores. No chassis or board was cut or damaged during this process, as I wanted to keep its integrity, and there are no throttle issues whatsoever. As I already had the 120Hz eDP screen, it was an easier job to do. Thanks to @Striker123 and @D4ddy for their threads, as they gave me the guts to go through with the project, and to Rick for selling me this perfect example of a computer.

        Alienware m17x R4 1070 MXM: http://forum.notebookreview.com/threads/msi-gtx-1070-mxm-successfully-working-on-alienware-m17x-r4-another-socket-victory-against-bga-crap.803637/
        Alienware m17 Ranger 1070 MXM: http://forum.notebookreview.com/threads/alienware-17-r5-gtx-1070-mxm-3-1b.800137/

        Here are some before-and-after benchmarks so you can see this baby flying; let me know what you think about these numbers... I will be uploading more pictures soon, as it only lets me upload 5. If you guys have any questions I'd be glad to help out.

        Current issues:
        - The GPU fan does not work automatically, so I have to use HWiNFO to manually build the fan table, but HWiNFO sucks at this and doesn't start the fans automatically either. Also, with HWiNFO I take control of both CPU and GPU fans at the same time, and sometimes I just want my CPU fan to be spinning. So if someone has a fix for this, that would be great!
      • By Kel
        I got the EXP GDC Beast 8.5 and got all the connections right... the laptop boots up... the fans on the GPU turn on... Device Manager detects the GTX 1060 with a "!" in a yellow triangle. Now what to do... I tried to do the BIOS mod... but realised that I have the 3.07 version.
        Questions:
        1. Is there any way to get this done without the mod?
        2. If the mod is the only way, kindly help me with step-by-step instructions in layman's terms... I'm not a techie, so I might not understand technical terms... although I am interested, but still learning. Please help.
      • By thechillhacker
        Hello all
         
        I have a Clevo P370EM3 laptop with a pair of GTX 680Ms in SLI and have never been able to get Linux to work properly with full acceleration and without tearing. I have tried every xorg config I can find, and even all the modifications from the NVIDIA configuration utility, to no avail. Has anyone been able to get these things to work 100% in Linux? I have typically used Ubuntu-based distros, including Mint, and every driver version available from the repositories, but I can never get it to work quite right.
         
        Thanks for any help anyone can provide. It is killing me to run Windows on this machine because of these issues, and I am almost to the point of installing the inferior 7970 cards.
         
        Have a great day
        -TCH
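        For anyone hitting the same wall, a minimal `xorg.conf` Device section that turns SLI on with the proprietary driver is sketched below; the BusID is an assumption, so verify yours with `lspci`:

```
Section "Device"
    Identifier "NvidiaCard"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"      # assumed; check with: lspci | grep -i nvidia
    Option     "SLI" "On"       # documented values include On, Auto, AFR, SFR
EndSection
```

        `nvidia-xconfig --sli=On` can generate something similar, though it tends to overwrite an existing config, so back yours up first.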