• [HARDWARE MOD]980m to Desktop 980 core upgrade


    Khenglish

    Due to a stupid accident on my part, I acquired a 980m with a chunk knocked out of the core. Not wanting to scrap a perfectly good top-end PCB for parts, I decided to replace the core. You can see the gouge in the core to the left of the TFC918.01W marking, near the left edge of the die.

     

    First I had to get the dead core off:

     

    uel9YeG.jpg

     

    With no one on eBay selling bare GM204 cores, my only option was to buy a full card. With no mobile cards under $500, I had to get a desktop card. And with this much effort involved in the repair, of course I got a 980 instead of a 970.

     

    Below is the dead 980 I got off ebay:

     

    uBLQg5Q.jpg

     

    You can see that for some reason someone removed a bunch of components between the core and the PCI-E slot. I have no idea why anyone would do this. I tried the card and got error 43. The PCB bend seemed too slight to kill the card, so those missing components had to be the cause.

     

    GPUs are often dead because someone removed or installed a heatsink incorrectly and broke a corner off the core, so buying cards on eBay just for their cores is a gamble. This core is not even scratched:

     

    MhKJV4d.jpg

     

    Preheating the card before applying high heat to pull the core:

     

    FYMovFp.jpg

     

    And the core is off. It survived the pull:

     

    P1JD3jA.jpg

     

    Next, the 980 core (left) cleaned of solder, beside the original 980m core (right):

     

    PVmfFqE.jpg

     

    Next I need to reball the 980 core and then mount it on the card. I am waiting for the BGA stencil to arrive from China; it still has not cleared US customs:

     

    https://tools.usps.com/go/TrackConfirmAction?tLabels=LS022957368CN

     

    When it shows up, expect the core to be on the card within 1-2 days.

     

    Some potential issues with this mod, besides me physically messing something up:

     

    I believe that starting with Maxwell, Nvidia began flashing the core configuration onto the cores themselves, much like Intel does with CPUID. I believe this because I found laser cuts on a GK104 from a 680m, but could not find any on two GM204 cores. In addition, Clyde figured out device IDs on the 680m and K5000m: they are set by resistor values on the PCB. The 980m has the same resistor layout as the 680m for the lowest nibble of the device ID (0x13D7), but all of those resistors are absent, and populating them does nothing. Resistors do exist for the 3 and the D in the device ID. Flashing a 970m vBIOS onto my 980m did not change the device ID or core configuration either. If this data is not stored on the PCB through straps or in the vBIOS, then it must be stored on the GPU core.
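    The strap-versus-fuse split can be sketched in code. This is a minimal model of my working assumption (upper 12 bits of the device ID strapped on the PCB, lowest nibble fused into the core) and is not documented by Nvidia anywhere I know of:

```python
# Toy model of how the 16-bit PCI device ID appears to be assembled.
# The strap/fuse split is an inference from probing resistors, not
# anything Nvidia documents.

def device_id(strap_bits: int, core_nibble: int) -> int:
    """Upper 12 bits from PCB strap resistors; lowest 4 bits
    fused into the GPU core itself."""
    return ((strap_bits & 0xFFF) << 4) | (core_nibble & 0xF)

# 980m: straps give 0x13D, the 980m core supplies the trailing 7
assert device_id(0x13D, 0x7) == 0x13D7

# Same PCB with a desktop 980 core (fused nibble 0) -> 0x13D0
assert device_id(0x13D, 0x0) == 0x13D0

# Restrapping the D nibble to C gives the desktop 980's real ID
assert device_id(0x13C, 0x0) == 0x13C0
```

    This model would also explain why populating the absent nibble resistors does nothing: if the core overrides the lowest 4 bits, the PCB straps for that nibble are simply never read.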

     

    So I expect the card with the 980 core to report its device ID as 0x13D0: the first 12 bits pulled from the PCB, and the last 4 from the core. 0x13D0 does not exist as a real product ID. I may be able to add it to the driver's .inf, or I may have to change the ID on the board. With the ID's trailing 0 hardset by the core, the only valid ID I can strap to is 0x13C0, matching a desktop 980.
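    If the card does enumerate as 0x13D0, the .inf edit would look something like the sketch below. The section and string names here are hypothetical — they vary between driver releases, so in practice you would mirror whatever an existing DEV_13D7 (980m) entry in your nvdmi.inf uses:

```ini
; Hypothetical nvdmi.inf additions for the unlisted 0x13D0 ID.
; Copy the form of an existing 980m (DEV_13D7) line in your driver's
; inf; "Section001" is a placeholder for its install section.

[NVIDIA_Devices.NTamd64.10.0]
%NVIDIA_DEV.13D0% = Section001, PCI\VEN_10DE&DEV_13D0

[Strings]
NVIDIA_DEV.13D0 = "NVIDIA GeForce GTX 980 (MXM mod)"
```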

     

    An additional issue is that the core may not fully enable. Clyde put a 680 core on a K5000m and never got it to unlock to 1536 CUDA cores. We never figured out why.

     

    Lastly, there was very tough glue holding the 980m core on. While removing this glue I scraped some of the memory traces on the PCB. I checked with a multimeter and the traces are still intact, but significant damage here could hurt memory stability. I think they are OK though, just exposed.

     

    Due to Clyde's lack of success in getting his 680 core to fully unlock, I am concerned I might not get all 2048 CUDA cores. Even if I don't, I should still have a very good chip. Desktop chips are binned better than mobile chips (most 980s are over 80% ASIC quality, while most 980ms are below 70%), and this particular card is a Galax 980 Hall of Fame, which are supposedly binned from the best 980 chips. A 90%+ ASIC would be great to have; the mid-60s chips we get in the 980m draw tons of power.


    I want to give a special thanks to Mr. Fox. This card was originally his; he sent me one card to mod and one to repair. I repaired the broken one and broke the working one, and the broken one is the one I've been modding.

     

    Article update: SUCCESS!

    Core finally reballed. If the mount is poor I will be very very angry...

    20160718_010715.jpg

     

    Card cooling. New brain installed.

    20160718_013837.jpg

     

    So it actually works with the 980m vBIOS. I tried modding too soon; I just needed to reinstall the driver. I only ran a very lightweight render test because right now the card is running on only 2 phases. I'm pulling the phase driver from my 980m now to get the 3rd phase back up.

    !!!!!!!!!!!!.png

     

     

    Follow the rest of the discussion here:

     




    User Feedback




    So you're telling me that the 980m cores we get are the worst cores, ones that would have been a 970?

    And they charge 800€ for it?!

     

    I was really thinking we would get the very good chips because... "laptop" is what justifies the price.

     

    So basically gaming laptops are a goldmine for the manufacturers?! Worst quality for insane money?!


    OH YEAH, way to go KHENGLISH!!!!!!!! 

     

    you are now officially a mad scientist!!!!

     

    You are officially REDDIT famous! (Saw this over on reddit too!)

    Edited by jcagara08


    Brother Khenglish, this is amazing.

    So, in essence, you may be running a world-first MXM 980 desktop core?

    So... suppose you installed that card in some really ancient machine...

    Guest DCMAKER

    Posted

    Can you put a 980 Ti core in this or no? Just curious. Will you try this with the next-gen Pascal cards?


    The 980 Ti (GM200) is built for a 384-bit memory bus and its package is far too large to fit on a 980m. GTX 1070 (GP104) cores have a smaller package than GM204 and are also incompatible. The GTX 1080 is the wrong size and also has the wrong memory interface.

     

    The only other compatible core is the GK104. Conversely, this means the 680m, 780m, and 880m could take a 980 core if someone wrote their own vBIOS for it. The 680m's Kepler vBIOS has no chance of running a Maxwell core, and a 980m vBIOS won't work due to a completely different core VRM and power monitoring setup. On top of the vBIOS work, I know at least one hardware strap would have to be changed as well.

    Edited by Khenglish

    Guest DCMAKER

    Posted

    3 hours ago, Khenglish said:

    The 980 Ti (GM200) is built for a 384-bit memory bus and its package is far too large to fit on a 980m. GTX 1070 (GP104) cores have a smaller package than GM204 and are also incompatible. The GTX 1080 is the wrong size and also has the wrong memory interface.

     

    The only other compatible core is the GK104. Conversely, this means the 680m, 780m, and 880m could take a 980 core if someone wrote their own vBIOS for it. The 680m's Kepler vBIOS has no chance of running a Maxwell core, and a 980m vBIOS won't work due to a completely different core VRM and power monitoring setup. On top of the vBIOS work, I know at least one hardware strap would have to be changed as well.

    I was referring to when Pascal comes out: swapping a 1080m core with a 1080's, like you did here.

     

    K, that's what I thought. The 980 Ti is too different :'(

    6 minutes ago, Guest DCMAKER said:

    I was referring to when Pascal comes out: swapping a 1080m core with a 1080's, like you did here.

     

    K, that's what I thought. The 980 Ti is too different :'(

     

    I once tried putting a GDDR5 4890 core on a GDDR3 4850m. It was not compatible, despite the BGAs on both cores looking like they should be. There will be differences in the memory interface that would prevent a 1080 core from running on a 1080m.

     

    The only possible mod I see is moving a 1070m core onto a 1070 desktop for hwbot records.

    Guest DCMAKER

    Posted

    1 hour ago, Khenglish said:

     

    I once tried putting a GDDR5 4890 core on a GDDR3 4850m. It was not compatible, despite the BGAs on both cores looking like they should be. There will be differences in the memory interface that would prevent a 1080 core from running on a 1080m.

     

    The only possible mod I see is moving a 1070m core onto a 1070 desktop for hwbot records.

    What changed between the 980m/980 and the 1080m/1080?

    On 16.08.2016 at 10:53 AM, Khenglish said:

    It's pretty close to a regular card. The lack of memory clocks hurts. 16078 GPU score in Fire Strike, and 21748 GPU score in 3DMark 11.

     

    http://www.3dmark.com/3dm/13958342

     

    http://www.3dmark.com/3dm11/11477667

     

     

    Wow!!!!! Nice!!!!!!! Super!!!!! Do you think it would work in SLI mode with a 980M if I soldered a chip onto each of two cards?

    Is your card working without any troubles or problems? How much wattage would I need in my power supply if I did the same but with two cards?


    If two cards were modded I would expect SLI to work.

     

    If you were running two cards you would want a dual power supply setup. I have hit 280W power draw from the card at 1.2V / 1502MHz.

     

    With the 1080 coming out, though, this mod has lost much of its practicality. I only did it because I had a 980m with a dead core that I knew was otherwise fine. 980 cores are not easy to find, and I got very lucky to land one for only $143.
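    As a back-of-envelope check on PSU sizing for a dual-card setup (the 280W per card is my measured peak; the 100W system figure and 20% margin are assumptions, not measurements):

```python
# Rough PSU sizing for one or two modded cards.
# CARD_PEAK_W is measured; SYSTEM_W and HEADROOM are guesses.

CARD_PEAK_W = 280   # measured at 1.2V / 1502MHz
SYSTEM_W = 100      # CPU + rest of system (rough assumption)
HEADROOM = 1.2      # 20% margin so the supplies aren't maxed out

def psu_needed(num_cards: int) -> float:
    """Suggested combined PSU wattage for num_cards modded cards."""
    return (num_cards * CARD_PEAK_W + SYSTEM_W) * HEADROOM

print(psu_needed(2))  # ~792W -- well beyond a single 330W brick
```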

    Do you think I could easily install two GTX 1080 cards into my P870DM system?

    2 hours ago, Kolich said:

    Do you think I could easily install two GTX 1080 cards into my P870DM system?

     

    Two won't fit. You can only fit one, and even that will need new heatsinks.


    Wow, I recently read on this forum that someone heated their GPU in an oven because it wasn't working, and I thought it was a joke.

    It really isn't. lol GOOD JOB

    Would it be good if I installed two 1080Ms and then put cores from a GTX 1080 on them? Would they fit?


    The original P870DM cannot fit two 1080s, and may not even be able to fit one. If it could fit one, you would need new heatsinks.

     

    There is not much reason to change the core anyway; both have 2560 CUDA cores.


    Good job. We once reflowed a laptop motherboard in an oven at work. It fixed the problem, but the laptop always had a hint of burnt odor coming off it. I suspect we used too much heat.




