Hey, what's up guys and gals? New member, but I've been researching here for quite some time. I happened to find a good deal on eBay after a heated bidding war, lol:

2011 Alienware M17x R3
i7-2630QM CPU
16GB RAM
Nvidia GTX 460M GPU
Windows 8.1 Pro x64

So basically I want to upgrade the GPU to get a little more performance out of her; a few games have terrible lag while I'm playing. I've been looking on eBay and researching a new card for the past couple of weeks. Should I go with a 580m, or save the extra money and get a 680m or a 780m? Something along those lines. I don't want to spend too much on a newer GPU. Any opinions would be appreciated!
So I didn't like that the memory on my 980m only clocked to 6.4 GHz after raising the voltage from 1.35V to 1.48V, and I wanted my memory to run even faster. I knew someone with a spare 970, so we made a deal: I buy the card, and if it still worked after I switched all the memory chips, he'd buy it back (for a reduced amount if it could no longer do 7 GHz, but could still do at least 6 GHz). Long story short, he bought the card back and I got faster memory.

MSI 970 4GB Lightning original memory: Samsung K4G41325FC-HC28 (7 GHz rating, 8 GHz max overclock)
MSI 980m 4GB original memory: Hynix H5GQ4H24MFR-T2C (6 GHz rating, 6.4 GHz max overclock)

Both cards are GM204 chips. The 980m has one fewer CUDA core block enabled than the 970, but it has the full 256-bit memory interface and L2 cache with no 3.5GB issues, while the 970 is 224-bit with 1/8th of its L2 cache disabled. Both cards are 4GB with 8 memory chips.

I highly suspected this memory swap would work, because video cards read literally nothing from a memory chip. There is no asking what the chip is, or even its capacity. They write data to it and hope they can read it back. The memory manufacturer information read by programs like GPU-Z isn't even read from the memory; it's set by an on-board resistor. I had also changed multiple memory chips in the past, so I was fairly confident I could physically do the job.

I started with just one chip switched on both cards. This meant both cards were running a mix of memory from different manufacturers and of different speed ratings, but with the same internal DRAM array configuration. Both cards worked. Here is a picture of the 980m with one chip switched over:

Now how did the cards react? The 980m behaved no differently: no change in max overclock. The 970 though... I expected it to be slower... but...
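To illustrate the "write data and hope to read it back" point: here is a loose software analogy (my sketch, not actual GPU firmware) of how a memory controller qualifies a chip during link training. Nothing in the loop ever queries the chip's identity or capacity; the only thing that matters is whether writes come back intact.

```python
import random

def readback_test(memory, patterns=4, seed=0):
    """Write pseudorandom patterns and verify readback, the way a
    controller trains a GDDR5 link: it never asks the chip what it
    is, it only checks that what it wrote reads back correctly."""
    rng = random.Random(seed)
    size = len(memory)
    for _ in range(patterns):
        pattern = [rng.randrange(256) for _ in range(size)]
        for i, b in enumerate(pattern):
            memory[i] = b          # write phase
        if memory != pattern:      # read-back compare
            return False
    return True

# Any buffer that stores bytes faithfully "passes", regardless of vendor.
ram = [0] * 64
print(readback_test(ram))  # True
```

This is why mixing Samsung and Hynix chips can work at all: as long as each chip stores and returns the bits, the controller is satisfied. What differs between vendors is how fast they can do it under a given set of timings.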
970 with 1 Hynix chip, 7 Samsung (originally 8 Samsung):
7 GHz = artifacts like a crashed NES, even at the desktop
6 GHz = artifacts like a crashed NES, even at the desktop
5 GHz = artifacts like a crashed NES, even at the desktop
2 GHz = fully stable, 2D and 3D

I didn't try 3 GHz or 4 GHz, but yeah, a HUGE clock decrease. I shrugged though and kept switching memory, figuring that as long as it worked at any speed, I could figure out the issue later. Through 7/8 chips switched there was no change in max memory clocks.

What was really fun was when I had 7/8 chips done: my GDDR5 stencil got stuck and ripped 3 pads off the final Samsung chip. Needless to say there was a very long swearing spree. Looking up the datasheet, I found that 2 of the pads were GND and the 3rd was an active-low reset. Hoping the reset was unused, I checked the 970's side of the pad and found it was hardwired to GND, which meant the signal was unused. I also got a solder ball onto the sliver of one of the GND pads that was left, so I was effectively only missing a single GND connection. I put the mangled 8th chip in the 980m and it worked.

Net gain after all of this... a 25 MHz higher max overclock. Something was obviously missing. I figured I would switch the memory manufacturer resistor, hoping that would do something. I saw that Clyde had found this resistor on a k5000m, and that switching it from the Samsung value to the Hynix value had no effect for him. He found that on the k5000m the value for Hynix was 35k ohms and for Samsung 45k ohms. I searched the ENTIRE card and never found a single 35k ohm resistor. Meanwhile the 970 also worked with all 8 chips swapped, at a paltry 2.1 GHz.

Then I got lucky. Someone with a Clevo 980m killed his card while trying to change resistor values to raise his memory voltage. His card had Samsung memory. He sent his card to me to fix, and after doing so I spent hours comparing every single resistor on our boards looking for a variation.
Outside of VRM resistors there was just a single difference: on his card (his is shown here) the boxed resistor was 20k ohms. On mine it was 15k ohms. I scraped my resistor with a straight-edge razor (I could not find a single unused 20k resistor on any of my dead boards), raising it to 19.2k and hoping that was close enough. And it was! Prior to this I had also raised the memory voltage a little more, from 1.48V to 1.53V. My max stable clocks before the ID resistor change were 6552 MHz. They are now 6930 MHz, a 378 MHz improvement. Here's a 3DMark 11 run at 7.5 GHz (not stable, but it still ran): http://www.3dmark.com/3dm11/10673982

Now what about the poor 2 GHz 970? I found its memory ID resistor too: memory improved from 2.1 GHz to 6.264 GHz. Surprisingly the memory was slower than it was on the 980m; I expected the 970's vBIOS to have looser timings built in to run the memory faster. As for why the memory was over 100 MHz slower than on the 980m: the 980m actually has better memory cooling than the 970. With the core at 61°C I read the 970's backside memory at 86°C with an IR thermometer. Meanwhile the 980m has active cooling on all memory chips, so they stay cooler than the core. In addition, the 980m's memory traces are slightly shorter, which may also help.

The 980m at 6.93 GHz is still slower than the 8 GHz the 970 was capable of with the same memory. I'm not sure why. Maybe memory timings are still an issue. Maybe, since MSI never released a Hynix version of the 970, leftover timings for an older card like a 680 were used instead of the looser timings that should have been (I know that in system BIOSes, tons of old, unused code gets pushed forward generation after generation). I don't know, just guessing. Talking to someone who knows how this stuff works would be great. I still want 8 GHz.

Some more pics. Here's one with the 970 about to get its 3rd and 4th Hynix chips:

Here's my 980m with all memory switched to Samsung. Sorry for the blurriness:

So in summary:

1. It is possible to mix Samsung and Hynix memory, or switch entirely from one manufacturer to another, with some limitations.

2. There is a resistor on the PCB that tells the GPU which memory manufacturer is connected. This affects memory timings, and maybe termination, and it has a large impact on memory speed, especially for Hynix memory. The resistor value can be changed to another manufacturer's, but it is not guaranteed that the vBIOS will contain the other manufacturer's timings, and if it does, they may not be 100% correct for your replacement memory.

3. If you take a card meant for Hynix memory, you can mix in Samsung memory of the same size if it is faster memory. If the memory is the same speed, the penalty of running Samsung with Hynix timings may hurt memory clocks.

4. If you take a card meant for Samsung memory, you cannot mix in any Hynix memory without MAJOR clock speed reductions unless you also change the memory manufacturer resistor. Again, it is not guaranteed that the vBIOS will contain the other manufacturer's timings, or, if it does, 100% proper timings for your specific memory.

5. For Kepler cards the Samsung resistor value is 45k and the Hynix value is 35k. For Maxwell cards the Samsung value is 20k and the Hynix value is 15k.

Next up is changing the hardware ID to a 980 notebook ID. Clyde also found the HWID to have an impact on the number of CUDA core blocks enabled. In about a month I can get hold of a 970m that someone is willing to let me measure resistor values on; it has the same PCB as the 980m. Does Nvidia still laser-cut the GPU core package? We will find out.

Full thread can be found here: https://www.techinferno.com/index.php?/forums/topic/9021-hardware-mod-gtx980m-hynix-to-samsung-memory-swap/#comment-134361
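The strap values above can be collected into a small lookup. This is a hedged sketch of how a vBIOS *might* bin a measured strap resistance into a vendor ID; the kilo-ohm values are the ones measured in this thread, but the nearest-value binning logic is my assumption, not Nvidia's documented behavior. It does show why the scraped 19.2k resistor was "close enough": it sits only 4% below the 20k Samsung nominal, far closer than the 15k Hynix value.

```python
# Vendor-ID strap values measured in this thread (ohms).
# Binning by nearest nominal value is an assumed mechanism.
STRAP_TABLES = {
    "kepler":  {45_000: "Samsung", 35_000: "Hynix"},
    "maxwell": {20_000: "Samsung", 15_000: "Hynix"},
}

def vendor_from_strap(generation, measured_ohms):
    """Map a measured strap resistance to the nearest nominal bin."""
    table = STRAP_TABLES[generation]
    nominal = min(table, key=lambda r: abs(r - measured_ohms))
    return table[nominal]

# The scraped resistor ended up at 19.2k: 800 ohms from the Samsung
# nominal vs. 4200 ohms from the Hynix one, so it reads as Samsung.
print(vendor_from_strap("maxwell", 19_200))  # Samsung
print(vendor_from_strap("kepler", 35_000))   # Hynix
```

Under this model, anything within roughly ±(bin spacing)/2 of a nominal value would still resolve correctly, which is consistent with the imprecise razor-scrape working on the first try.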
Hello guys, hoping you can help me out here. I recently picked up a Gigabyte P35X v3 CF2 with a GTX 980m. I also have an MSI GT70-0ND with a GTX 675m. Unfortunately the MSI's video card burnt out about a year ago, but it is a much better machine for me: it stays much cooler and has a better screen/touchpad/keyboard. The P35X also gets way too hot, and the motherboard has already died on me once; I believe the thermal solution in the Gigabyte just can't handle gaming, even with a cooling pad. During gaming I would see 95°C+ on the CPU, and the whole wrist pad gets very hot. My GT70 would only see 80°C+ during heavy gaming, and I don't need to carry around a cooling pad with it.

Anyway, I want to swap the video card over from the P35X to the GT70. Has anyone done this with success? I believe I could just flash the vBIOS with the MSI version available from TechPowerUp. I am also wondering about the heatsink. I work as a designer at a business with several CNC vertical mills, so I could fab something up; just wondering if anyone has attempted this, as it seems every GTX 675m out there has burned itself out by now.