
GeForce GTX 680MX Discussion


ssj92


It's not about the GPU itself, though; it's the price point. If the new chip is supported on MXM, the old chip may drop in price a bit (unless they're really attached to the current price point).

I had more than enough to purchase a nice, cheap build. (The reseller I got my P150EM from is eBay-only: computerupgradeking.)


Waiting for AMD's reply... probably a 7990?

Sent from my GT-I9300 using Tapatalk 2

If that were the case, it would require AMD to make a whole new architecture, which I doubt they will, so the only logical thing for them to do is basically release an OC'd 7970M and call it a 7990M (just like nVIDIA did with the 680MX... lol)


so the only logical thing for them to do is basically release an OC'd 7970M and call it a 7990M (just like nVIDIA did with the 680MX... lol)

The 680MX isn't an overclocked 680M; it's a different chip, the same one the desktop GTX 680 uses, and it has more shaders than the 680M / 670.
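As a back-of-the-envelope check on that shader difference (the core counts below are Nvidia's published figures for these parts; this is an illustration, not a benchmark):

```python
# Rough comparison of GTX 680M vs GTX 680MX shader counts.
# Core counts are Nvidia's published specs; at similar clocks the
# count ratio is a crude ceiling on the raw throughput advantage.
CORES_680M = 1344   # GTX 680M: cut-down GK104 (same count as desktop GTX 670)
CORES_680MX = 1536  # GTX 680MX: fully enabled GK104 (same as desktop GTX 680)

extra_shaders = CORES_680MX / CORES_680M - 1
print(f"680MX has {extra_shaders:.1%} more shaders than the 680M")
# → 680MX has 14.3% more shaders than the 680M
```

In practice games rarely scale linearly with shader count, which is why estimates in this thread land well below that 14.3% figure.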


  • 1 month later...

Just found an OEM barebone vendor offering an MSI chassis with the GTX 680MX instead of the GTX 680M. That could mean it's actually a replacement for the 680M :(

So it's not gonna be exclusively for Apple.

Here is the link Firebat_F740mx..

Since Viooo is a Chinese OEM vendor you should expect to see Chinese text; click the second tab and that should get you to the specification page.

I'm wondering whether that 180W brick can supply enough power to feed that new beast, LOL.


It's the same chip, just with more shaders activated :P

They're laser-cut, I think? So there's no way to turn a GTX 680M chip (a 670) into a GTX 680MX (a 680); otherwise people would already have done that with the desktop cards (the desktop community is way bigger).


Rebadge or not, the 680M is a beast! Can't go wrong with awesome.

http://www.youtube.com/watch?v=KBDHBYI7TKs


It is basically the same as the desktop 670-vs-680 jump, except with no clock advantage for the 680MX. Expect about a 5% performance increase over the 680M, and if it runs hotter or eats more power it might not OC as well, making it entirely possible for an OC'd 680M to be faster, just like the 670 vs 680 in the desktop space.

Wrong: the major difference is the higher-voltage memory at 1.25 GHz versus 900 MHz on the 680M. This also means it will likely clock up to 1.5-1.6 GHz, which will make a huge difference in certain situations and especially when overclocking.
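For what those memory clocks work out to in bandwidth (a quick sketch; both parts use GDDR5 on a 256-bit bus per their published specs, and GDDR5 moves 4 data transfers per command-clock cycle):

```python
# Peak memory bandwidth implied by the clocks quoted above.
# 256-bit bus width is from the published specs of both cards.
BUS_BITS = 256

def bandwidth_gbs(mem_clock_mhz):
    """Peak bandwidth in GB/s for GDDR5 at the given command clock (MHz)."""
    effective_gts = mem_clock_mhz * 4 / 1000  # GDDR5: 4 transfers/cycle -> GT/s
    return effective_gts * BUS_BITS / 8       # bits -> bytes per transfer

print(bandwidth_gbs(900))   # 680M  at 900 MHz:  115.2 GB/s
print(bandwidth_gbs(1250))  # 680MX at 1.25 GHz: 160.0 GB/s
```

That is roughly a 39% bandwidth increase at stock, before any of the 1.5 GHz+ overclocking speculated about here.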


Wrong: the major difference is the higher-voltage memory at 1.25 GHz versus 900 MHz on the 680M. This also means it will likely clock up to 1.5-1.6 GHz, which will make a huge difference in certain situations and especially when overclocking.

So that means high voltage = high temps... If this is coming to laptops, then we'll need water cooling to dissipate the heat and a higher-wattage AC adapter... or a new laptop design to handle the heat. Oh well, I'm happy with my 680M; it performs like a desktop GPU.


Still running a 2920XM, or already using the 3920XM?

The 3920XM has not arrived yet, so that run was with the 2920XM. And it was only overclocked to 3.5GHz on the CPU; I forgot I had it set to my lowest ThrottleStop profile until the video was already uploading, LOL. I meant to run it at 4.7GHz.


  • 2 weeks later...
Wrong: the major difference is the higher-voltage memory at 1.25 GHz versus 900 MHz on the 680M. This also means it will likely clock up to 1.5-1.6 GHz, which will make a huge difference in certain situations and especially when overclocking.

So the 680MX is not going to be a mere overclocked 680M, but will be able to clock up to 1.5GHz and above? Has the great Nvidia rebadge cycle been broken?


Well, I know my P170HM will power this card, and if it comes out for notebooks in the MXM 3.0b form factor I might get it. What I'm really looking forward to is the 700-series flagship card. Hopefully it won't be another one of those moves Nvidia pulled with the 500 series.


I just purchased a Sager NP9370 with a single GTX 680M and am not impressed so far. I was expecting there would be nothing it couldn't handle, but am saddened to find lag and frameskipping in newer games at high settings. I'm looking into overclocking the GPU at this point. If the MX isn't slated for the near future, then I may try adding a second 680M in SLI.


I just purchased a Sager NP9370 with a single GTX 680M and am not impressed so far. I was expecting there would be nothing it couldn't handle, but am saddened to find lag and frameskipping in newer games at high settings. I'm looking into overclocking the GPU at this point. If the MX isn't slated for the near future, then I may try adding a second 680M in SLI.

Flash it with one of svl7's vBIOSes, then OC, and your opinion will change, I guarantee it!


  • Founder
I just purchased a Sager NP9370 with a single GTX 680M and am not impressed so far. I was expecting there would be nothing it couldn't handle, but am saddened to find lag and frameskipping in newer games at high settings. I'm looking into overclocking the GPU at this point. If the MX isn't slated for the near future, then I may try adding a second 680M in SLI.

You should have gotten SLI to begin with for a notebook like that. And like the guy above said, use a custom vBIOS and OC.

Sent from my GT-N7000


You should have gotten SLI to begin with for a notebook like that. And like the guy above said, use a custom vBIOS and OC.

Everything I read prior to purchasing indicated that a second 680M was overkill. However, the option is still there to add the second card, as I am within Sager's 30-day satisfaction guarantee. I will be flashing svl7's vBIOS in a bit and OC'ing. From what I've read here it should make a major difference.


  • 2 weeks later...
You should have gotten SLI to begin with for a notebook like that. And like the guy above said, use a custom vBIOS and OC.

Sent from my GT-N7000

^^^That^^^

I've just run the Crysis benchmark tool, and at 1920x1080 with everything on MAX I get ~65fps average. 860M SLI FTW.


  • 4 weeks later...

So, per Wikipedia (still subject to change), the specs show the 680MX becoming the next 780M. I think I just lost a load of respect for Nvidia: giving us something half a year old that Apple already has their hands on while charging us a premium, possibly with the same TDP as well. Load of BS, man.

look up 680mx and 780m Comparison of Nvidia graphics processing units - Wikipedia, the free encyclopedia


I almost cried after reading unityole's post. Nvidia just took the easy way out. I pray that they at least improve the cooling for the 780M. But shame on them anyway. What's next, a 5% improvement on the 780M, call it the 780MX, exclusive to Apple again? Then an 880M for everyone?


No matter what it's called, if (after Apple) the card hits the market at the price of the HD 8970M (the same PCB could reduce production costs), I certainly would not hesitate.

