lonesyndal Posted November 24, 2012
It's not about the GPU itself, though. It's the price point. If the new chip is supported on MXM, the old chip may drop in price a bit (unless they really love the current price point). I had more than enough to purchase a nice, cheap build. (The reseller I got my P150EM from is eBay only: computerupgradeking.)
jaug1337 Posted November 28, 2012
> Waiting for AMD's reply... probably a 7990?
If that were the case, it would require AMD to make a whole new architecture, which I doubt they will, so the only logical thing for them to do is basically release an OC'd 7970M and call it a 7990M (just like NVIDIA did with the 680MX... lol)
svl7 Posted November 28, 2012
> so the only logical thing to do for them is basically release a OC'd 7970M and call it a 7990M (just like nVIDIA did with the 680MX... lol)
The 680MX isn't an overclocked 680M; it's a different chip. It's the same one the desktop GTX 680 uses, and it has more shaders than the 680M / 670.
Meaker Posted December 1, 2012
It's the same chip, just with more shaders activated.
minominti Posted January 4, 2013
Just found an OEM barebone MSI vendor that includes the GTX 680MX instead of the GTX 680M. That could mean it's actually a replacement for the 680M, so it's not going to be exclusive to Apple. Here is the link: Firebat_F740mx. Since Viooo is a Chinese OEM vendor you should expect to see Chinese text; click the second tab and that should get you to the specification page. I'm wondering if that 180W brick can supply enough power to feed that new beast, LOL.
naldor Posted January 5, 2013
> It's the same chip but more shaders activated
They're laser-cut, I think? So there's no way to turn a GTX 680M chip (670) into a GTX 680MX (680); otherwise people would already have done that with the desktop cards (the desktop community is way bigger).
curl2k1 Posted January 6, 2013
Somehow, I get the feeling this will be the "next great rebadge" for NVIDIA in the spring.
Guest Posted January 6, 2013
Rebadge or not, the 680M is a beast! Can't go wrong with awesome. http://www.youtube.com/watch?v=KBDHBYI7TKs
naldor Posted January 6, 2013
Still running a 2920XM, or already using the 3920XM?
Meaker Posted January 6, 2013
> It is basically the same as the jump for desktop 670 vs 680 except no clock advantage for the 680mx. Expect about 5% performance increase over the 680m, and if it runs hotter or eats more power it might not OC as well, making it totally possible for the OC 680m to be faster just like in the desktop space for 670 vs 680.
Wrong, the major difference is the high-voltage memory at 1.25GHz vs the 900MHz on the 680M. This also means it will likely clock up to 1.5-1.6GHz, which will make a huge difference in certain situations, especially when overclocking.
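(For reference, the memory-bandwidth gap behind this argument is easy to work out as napkin math. A rough sketch: the 256-bit bus width is the standard published spec for both cards, and GDDR5 transfers data at four times the memory clock.)

```python
def gddr5_bandwidth_gbps(mem_clock_mhz, bus_width_bits=256):
    """Theoretical GDDR5 bandwidth in GB/s.

    GDDR5 is quad-pumped, so the effective data rate is 4x the
    memory clock; multiply by the bus width in bytes.
    """
    effective_transfers_per_sec = mem_clock_mhz * 4 * 1e6
    return effective_transfers_per_sec * (bus_width_bits / 8) / 1e9

# GTX 680M:  900 MHz memory -> ~115.2 GB/s
print(gddr5_bandwidth_gbps(900))
# GTX 680MX: 1250 MHz memory -> ~160 GB/s
print(gddr5_bandwidth_gbps(1250))
```

So at stock clocks the 680MX already has roughly 39% more memory bandwidth than the 680M, before any overclocking headroom is counted.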
omega939 Posted January 6, 2013
> Wrong, the major difference is the high voltage memory at 1.25ghz vs the 900mhz on the 680M, this also means it will likely clock up to 1.5-1.6ghz, this will make a huge difference in certain situations and especially when overclocking.
So that means high voltage = high temps... If this is going to be available for laptops then we'd need water cooling to dissipate the heat and a higher-wattage AC adapter... or a new laptop design that can handle the heat. Oh well, I'm happy with my 680M. It performs like a desktop GPU.
Guest Posted January 6, 2013
> Still runing an 2920xm or already using the 3920xm?
The 3920XM has not arrived yet, so that run was with the 2920XM. And it was only overclocked to 3.5GHz on the CPU; I forgot I had it set on my lowest ThrottleStop profile until the video was already uploading, LOL. I meant to run it at 4.7GHz.
zayin101 Posted January 14, 2013
> Wrong, the major difference is the high voltage memory at 1.25ghz vs the 900mhz on the 680M, this also means it will likely clock up to 1.5-1.6ghz, this will make a huge difference in certain situations and especially when overclocking.
So the 680MX is not going to be a mere overclocked 680M, but will be able to clock up to 1.5GHz and above? Has the great NVIDIA rebadge cycle been broken?
Khenglish Posted January 15, 2013
So when is one of you going to get a 680MX BIOS and flash it on your 680M?
ajbutch123 Posted January 17, 2013
Well, I know my P170HM will power this card, and if it comes out for notebooks in the MXM 3.0b form factor I might get it. What I'm really looking forward to is the 700-series flagship card. Hopefully it won't be another one of those things NVIDIA did with the 500 series.
datashifter Posted January 22, 2013
I just purchased a Sager NP9370 with a single GTX 680M and am not impressed so far. I was expecting there would be nothing it couldn't handle, but I'm saddened to find lag and frame skipping in newer games at high settings. I'm looking into overclocking the GPU at this point. If the MX isn't slated for the near future, then I may try adding a second 680M in SLI.
grechie Posted January 22, 2013
> I just purchased a Sager NP9370 with a single GTX 680M and am not impressed so far. I was expecting there would be nothing it couldn't handle, but am saddened to find lag and frameskipping in newer games at high settings. I'm looking into overclocking the GPU at this point. If the MX isn't slated for the near future, then I may try adding a second 680M in SLI.
Flash it with one of svl7's vBIOSes, then OC, and your opinion will change, I guarantee it!
Brian (Founder) Posted January 22, 2013
> I just purchased a Sager NP9370 with a single GTX 680M and am not impressed so far. I was expecting there would be nothing it couldn't handle, but am saddened to find lag and frameskipping in newer games at high settings. I'm looking into overclocking the GPU at this point. If the MX isn't slated for the near future, then I may try adding a second 680M in SLI.
You should have gotten SLI to begin with for a notebook like that. And like the guy above said, use a custom vBIOS and OC.
Zuppo Posted January 22, 2013
I guess the 680MX is available only in Mac machines? Really a BEAST of a card; I'd love to have such a thing in a barebone version.
datashifter Posted January 22, 2013
> You should have gotten sli to begin with for a notebook like that. And like the guy above said, use a custom vbios and oc.
Everything I read prior to purchasing indicated that a second 680M was overkill. However, the option should still be there to add the second card, as I'm within Sager's 30-day satisfaction guarantee. I'll be flashing svl7's vBIOS in a bit and OCing. From what I've read here it should make a major difference.
zayin101 Posted January 25, 2013
So is the 680MX going to be non-Optimus only? If so, I wonder if it will be compatible with my M17x, which has the 120Hz screen with Optimus already disabled?
Stank0 Posted February 6, 2013
> You should have gotten sli to begin with for a notebook like that. And like the guy above said, use a custom vbios and oc.
^^^That^^^ I just ran the Crysis benchmark tool, and at 1920x1080 with everything on max I got ~65fps average. 680M SLI FTW.
unityole Posted March 6, 2013
So... per Wikipedia (still subject to change), the specs show the 680MX becoming the next 780M. I think I just lost a load of respect for NVIDIA: giving us something half a year old that Apple already had their hands on, while charging us a premium, possibly at the same TDP as well. Load of BS, man. Look up the 680MX and 780M: Comparison of Nvidia graphics processing units - Wikipedia, the free encyclopedia
Expell666 Posted March 12, 2013
Almost cried after reading unityole's post. NVIDIA just took the easy way out. I pray that at least they improve the cooling for the 780M, but shame on them anyway. What's next, a 5% improvement on the 780M called a 780MX, exclusive to Apple again? Then an 880M for everyone?
Clyde Posted March 12, 2013
No matter what it's called, if (after Apple) the card hits the market at the price of the HD 8970M (the same PCB could reduce production costs), I certainly wouldn't hesitate.