Khenglish Posted May 25, 2015

So I managed to pull off upgrading a graphics card by changing the processor core on the card, while keeping the same card. The card is an 11-year-old Mobility Radeon 9600 Pro Turbo, found in the Dell Inspiron 8600. I replaced the core with one from a desktop 9600 XT, which is similar, but has a low-k dielectric in the interconnect stack for much higher frequencies, and has better Z compression. I had no way to test whether the improved Z compression had an impact, but the maximum stable overclock improved from 360MHz to 553.5MHz.

So who cares, since performance still sucks, you say? True, the performance went from abysmal to slightly less abysmal, but there were several major findings:

1. The original core was clearly marked as a Mobility Radeon 9600, while the upgraded core was marked as a 9600 XT. This means that despite core markings, parts can be interchangeable, even between mobile and desktop cards.
2. No BIOS or hardware ID change was required. The new core simply worked.
3. Windows and the BIOS noticed no change.
4. The new core clocks 53.75% faster than the old core.

So in short, the upgrade worked without a hitch, performed exactly as a card built for the new core would, and no extra work was needed besides physically switching the core. This means that much more modern core swaps, such as from a 980m to a 980 core for 33% more CUDA cores, would likely work with no extra effort besides switching the core out.

The card, prior to the final bake in the toaster oven:

What the card looked like in January. I posted this thread on the card you see below:

Benchmarks. CPU is a Dothan Pentium M at 2.4GHz.
3DMark01 stock: 11928
3DMark01 max stable overclock, original core: 13056
3DMark01 slightly unstable overclock (mem too high), new core: 15029
3DMark03 stock: 3160
3DMark03 max overclock, old core: 3538
3DMark03 max overclock, new core: 4289
3DMark05: Generic VGA video card benchmark result - Intel® Pentium® M processor 1.80GHz, Dell Computer Corporation 0Y4572
3DMark06: Generic VGA video card benchmark result - Intel® Pentium® M processor 1.80GHz, Dell Computer Corporation 0Y4572

So the plan now is to try a core swap on something more modern. I will next try a GF114 core from a 560 Ti onto a 485m, which is supposed to have a GF104 core. If that works, next is swapping the GF108 core on an NVS 5200m for a GF117, which is a full-node die shrink of the same core. I also might throw a 4870 core onto a 7970m with a dead core (the BGAs appear to match; the filtering on the backside of the cards is identical). If that works with a 3-generation gap, then pretty much anything will. I also want to find a Tonga core to put on one of the 2 dead 7970m cards that svl7 sent me.
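The clock and score gains quoted above check out with some quick arithmetic (a sketch using only the numbers posted in this thread):

```python
# Sanity-check the clock and benchmark gains from the 9600 core swap.

old_clock = 360.0    # MHz, max stable OC on the original Mobility 9600 core
new_clock = 553.5    # MHz, max stable OC on the desktop 9600 XT core

clock_gain = new_clock / old_clock - 1
print(f"Core clock gain: {clock_gain:.2%}")   # 53.75%, matching the post

# The 3DMark03 score scales sub-linearly with core clock, as expected:
# the CPU and memory clocks are unchanged, so the extra core frequency
# is only partially reflected in the final score.
score_old, score_new = 3538, 4289
score_gain = score_new / score_old - 1
print(f"3DMark03 gain: {score_gain:.2%}")
```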
Guest Posted May 25, 2015

Nice! This is pretty amazing. Good job.
Khenglish Posted May 25, 2015

Well this is interesting. All testing was done on an external screen because the internal one was not working. I thought this was because I had gotten solder flux and other crud in the LVDS connector, but now that it is clean, the internal screen is still black. It seems possible that LVDS is not routed on the 9600 XT packaging.
H658tu Posted May 25, 2015

Makes sense; why support something that would never be used on a desktop? A bit too much for a 9600 system, perhaps, but you could use a VGA/HDMI -> LVDS adapter. May not be very useful here, yet if the other BGA swaps run into the same issue...
Khenglish Posted May 25, 2015

> Makes sense; why support something that would never be used on a desktop? Bit too much for a 9600 system, perhaps, but could use a VGA/HDMI -> LVDS adapter. May not be very useful here, yet if the other bga swaps run into the same issue ...

Yeah, I could get an adapter. If I figure out the LVDS connector I could probably vastly simplify it to use the internal inverter, but with something so old it's not worth the time. The only unique thing this laptop offers is a floppy drive, but you can even get Samsung phones to read USB floppy drives.

As for modern systems, most use Optimus/Enduro, which does not require a single display output, so this is really only a problem on direct-output-only systems with an LVDS internal screen. eDP is likely unaffected since desktop cards do have DisplayPort outputs. SLI systems could use modded cards in the secondary slot.

A 560 Ti is arriving soon (dead and dirt cheap, of course). It's going onto either a 485m or a 580m; both need cores. The 485m is ready to go with the core and solder removed, while I still need to pull the core off the 580m (I got the dead 580m to put its core onto the 485m, only to find it was chipped...). The 485m should run trouble-free in my P150EM since it is an Optimus system, outside of possible ACPI issues. I can try it in a friend's P150HM to see if Nvidia also cuts out LVDS on their desktop chips.

Hmm, I just saw that there is a 256-bit version of Tahiti that seems to match the 7970m BGA. It's only 20% more cores though, and that unused chunk will be sucking up power. Tonga would be much better, but is much more expensive.
deadsmiley Posted May 26, 2015

I like it! I have a lot of respect for the skill it takes to pull this off.
Khenglish Posted May 26, 2015

> I like it! I have a lot of respect for the skill it takes to pull this off.

When the screen lit up with no artifacts I was surprised.

Also, I successfully pulled the core and cleaned the PCB of a Clevo 7970m. I'm now trying to find a 7870 XT, R9 285, or M295X for cheap for a core swap. Unfortunately people are only putting WORKING cards on ebay... not $30 dead ones.
Moderator angerthosenear Posted May 26, 2015

That's so awesome you managed to pull it off! I wonder how things would handle with a 980 core on a 970 board, with the memory issues. Will the 285 on your 7970m be the next big test? Or do you have other plans before jumping to that point?

Coming soon from Tech|Inferno: Titan XM
Khenglish Posted May 26, 2015

> That's so awesome you managed to pull it off! I wonder how things would handle with a 980 core on a 970 board with the memory issues. Will the 285 on your 7970m be the next big test? Or do you have other plans before jumping to that point?

I don't think a 980 core would work on a 970m board since the memory is only 192-bit, while the 980 expects 256-bit.

Next will be a 560 Ti core on a 485m. It should arrive today or tomorrow. The 485m also had a single memory chip replaced with one from a 680m, which has double the capacity. It will be interesting to see how much memory is reported if it works. I expect still 2GB, not 2.125GB.

An R9 285 would be nice on the 7970m. Soon the 300 series cards should be out for a full 2048-shader option, or at least to drive down R9 285 prices. Tahiti LE is still a possibility, although it will be slower and hotter.
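The 2GB vs. 2.125GB figures above work out if you assume the 485m populates 16 memory chips of 128MB each (an assumption here, not something stated in the thread) and one of them is swapped for a 256MB part from the 680m:

```python
# Back-of-the-envelope VRAM arithmetic for the single-chip swap above.
# ASSUMPTION: 16 chips of 128 MB each on the stock 485m; the 680m donor
# part is a double-density 256 MB chip. These counts are illustrative.
chips = [128] * 16            # stock: 16 x 128 MB = 2048 MB
chips[0] = 256                # one chip replaced with the double-density part

physical_mb = sum(chips)
print(physical_mb / 1024)     # 2.125 GB physically on the board

# If the memory controller addresses all chips symmetrically, the extra
# half of the bigger chip would go unused and the reported size stays at:
reported_mb = min(chips) * len(chips)
print(reported_mb / 1024)     # 2.0 GB, matching the "I expect still 2GB" guess
```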
ice.cold Posted May 26, 2015

Very interesting that your core swap worked without having to modify anything else.

Some time ago I tried flashing a 770m vbios onto a 765m MXM card. I had to edit the soft straps to match a 770m, as the drivers won't use the P0 state otherwise, but after that the card functioned just as it did before (using the lower clocks in the 770m vbios). Unfortunately it didn't unlock any extra shaders, but it does support the theory that a core swap on a newer Nvidia card would work without extra modification. This also suggests that a different memory configuration (192-bit to 128-bit) might not matter either.

I look forward to seeing your results with the 560 Ti core to 485m swap.
Moderator angerthosenear Posted May 26, 2015

> I don't think a 980 core would work on a 970m board since the memory is only 192 bit, while the 980 expects 256 bit.

Oh, I meant 980 on 970, both desktop versions. But yeah, with the memory bandwidth issues, you're probably right that it won't work.

Looking forward to your next core swap. Are you using a rework station or just the glorious oven?

What about power supply issues? I know MXM can pour quite a bit of wattage through it, but with the higher-tier cards won't this start to become more of an issue?
Khenglish Posted May 26, 2015

I don't see much point in core swapping a 980 onto a 970 since you already have the 980. I suppose it would work.

So far I have been doing a mix of toaster oven and heat gun. I have access to a reflow oven, but I don't want to worry about the fast heating rate destroying the die packaging, so I've just been using the toaster.

The P150EM's MXM circuitry is rated for over 200W, and overall power at 400W. I can easily raise it if I start getting close, so power draw is no problem.
Moderator angerthosenear Posted May 26, 2015

> I don't see much point on core swapping a 980 onto a 970 since you already have the 980. I suppose it would work.

Yeah, certainly no point in doing it, but it would be interesting to see if the 3.5GB VRAM issue would go away.

Are you having to use stencils and all that fun stuff to clean up the pads for the core swap?

Glad power isn't an issue; what about thermals? I know you did that cool radiator mod, so that'll help some. I guess we'll find out the more you do!
Khenglish Posted May 26, 2015

I think it would. The 980 has the full L2 cache enabled. I'm under the impression that the video BIOS is mostly for setting up the VRMs, communication interfaces, and letting the system know what the GPU is capable of, not really checking or caring what GPU it actually is. @svl7 anything to add on this?

I just ordered a stencil for the 7970m. I already have one for GF104/GF114 chips.

My only cooling concern is 24/7 use while avoiding the highest fan speed, which is LOUD. Benchmarks will be fine. I could make a mega heatsink from two 8970m P370SM heatsinks, but it would cost a lot to get both. It would likely cool the core ~7C better, with much better VRM and memory temps.
Moderator BAKED Posted May 27, 2015

Awesome work dude!
Guest Posted May 27, 2015

A GTX 970 core on a GTX 980M MXM board please, so we can have 1664 shaders already this year...
Khenglish Posted May 27, 2015

> A GTX 970 core on a GTX 980M MXM board please, so we can have 1664 shaders already this year...

Ha, well if I try that it's going to be a 980 core, since the 980m is already so damn expensive. I'm having issues with my solder flux burning and forming a barrier when working at the higher temps of unleaded solder, so that may take a while.
Guest Posted May 27, 2015

A 980M board won't be able to fuel a full GM204 to the extent that the actual results make it worth it... maybe if you can dig up a 100%+ ASIC chip...

Here, try this flux: Home | Löthonig® - www.loethonig.de

Picked it up during my last Europe trip... the best I have EVER used; it's basically soldering itself. This shop ships Stateside: http://www.ebay.de/itm/Lothonig-fur-schwerlotbare-Metalle-z-B-oxydierte-Teile-hochwertige-Qualitat-/261054889412?pt=LH_DefaultDomain_77&hash=item3cc8157dc4
H658tu Posted May 27, 2015

> it's basically soldering itself.

Thanks, will try it as well. Having a little trouble with a project (lead-free = $%#^@!).
H658tu Posted June 21, 2015

That Löthonig flux is good stuff; the only thing that worked, in fact. I did have to run the iron at 450°C though, which resulted in some burn-off. Better to try sub-400°C first; they claim no burn/barrier at that temperature.
Clyde Posted June 29, 2015

I did not notice this thread earlier. In brief: after my experience swapping the GK104 chip on a damaged K5000M for a GTX 680 desktop chip (1203 A2), it seems to me that the total number of active GPU cores does not depend on the chip itself, since the GK104 is almost identical between the desktop and mobile GPUs. It seems to me that it depends only on the BIOS power section.
ice.cold Posted June 29, 2015

> It seems to me that the number of total GPU active cores does not depend on the same chip, which in GK104 is almost identical for the desktop and for the mobile GPUs. It seems to me that it's only bios-power section depends.

I don't think the vbios is the only thing that controls the number of active shaders; I posted earlier about crossflashing my GTX 765m to a GTX 770m, which didn't unlock any extra shaders. I think the chip still needs to have the shader cores enabled in the first place; otherwise the extra shaders should have been unlocked.

It's strange how your screenshots only show 960 shaders for the K5000M, as the normal chip has 1344 shaders, and the GTX 680 should have 1536. If it was being limited by the vbios, then it should show 1344 for the transplanted chip, not 960. I guess one way to properly test whether the vbios limits the shaders is to crossflash a high-end card with a lower-end vbios and see if the shader count changes.
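The shader counts being argued about above all follow from GK104's layout: 192 CUDA cores per SMX, with the SKU determining how many SMX are enabled. A quick sketch of the known configurations:

```python
# GK104 shader arithmetic behind the K5000M discussion. Each enabled SMX
# contributes 192 CUDA cores; the SKU determines how many SMX are fused on.
CORES_PER_SMX = 192

skus = {
    "GTX 680":  8,   # full GK104
    "K5000M":   7,
    "GTX 770M": 5,
    "GTX 765M": 4,
}

for name, smx in skus.items():
    print(f"{name}: {smx * CORES_PER_SMX} shaders")

# The 960-shader reading on the transplanted board corresponds to a
# 5-SMX configuration, matching neither the K5000M's 1344 nor the
# GTX 680's 1536, which is why a vbios-only explanation looks incomplete.
```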
Clyde Posted July 1, 2015

I know it's not only the BIOS, but it is connected to the power section. And what if the board ID, or some stupid power-phases jumper, decides the number of cores running? Perhaps my damaged K5000M is not damaged, but is just a K4000M board crossflashed to a K5000M?
ice.cold Posted July 3, 2015

That's a possibility. Have you tried flashing a stock K5000M vbios onto it? If it really is a K4000M board, then with a non-matching vbios that isn't modified, the drivers won't let the card enter the P0 clock state.
Clyde Posted July 5, 2015

A little too late. When I have more time I will try to put it together and test. From what I remember, this K5000M/K4000M with a GK104 chip from the GTX 680 (desktop) still had K4000M performance (6-7000 points in 3DMark 11), the same as with the early 1149 A1 chip it came with.