
NVIDIA officially states they cut overclocking from mobile GPUs... :(



Yes, of all the other points, those are the ones I care about. Why? Because at the end of the day, if they happen to fix this "issue", you might as well end up with an nVIDIA GPU once again.

1. How? Tell me, how? That's precisely why I wrote about how development goes. You have to have funds to fund the damn development. The HD 7970M was awesome, and it wasn't gimped by crappy vRAM like the 680M was. Yet most people were still all over the 680M. Why the crappy vRAM? Well, that's nVIDIA we are talking about - "hey, our 680M with crappy RAM and a cut-down core ties with the 7970M, let's save some for the 780M, they'll buy it either way." I mean, c'mon, they could have gone flat-out and made the 680M what the 780M is, but then what? Anyway, even with relatively equal performance, and a bit more when overclocked, nVIDIA was still the one people considered. And things have tanked for AMD ever since. You have to buy a product for the company to earn something and reinvest it. Why should you care? Well, look at this thread - that's why.

2. G-Sync? The feature that now comes standard with the DP 1.2a specs? Again, that's nVIDIA we are talking about - money, money, money. "Hey, let's milk 200-250 dollars more for a display (with a board that's likely in the pocket-money range) that would have come anyway after a year or so - but we wouldn't be getting anything then, whereas now we can sell it as an exclusive feature." LOL! They are not creating anything new, they are just profiting on other people's ideas.*

3. Why not? Really, why not? I'll give an example with engines - air-cooled engines can go up to 210ºC, but water-cooled ones only up to 110ºC. The same goes for horsepower: you look at the specs on paper, but the actual performance may and will differ. What do I mean by this? I have always said, and will continue to say, that TDP is something to base your guess on, but it's not an exact science! The actual figures may and will differ, especially from one manufacturer to another. And to get back to the ºC - if they say it's fine, why not? If it fails, they are the ones who would cover it. Which brings us back to the main topic - nVIDIA is locking down and limiting things in order to cover up some f*%#$-up they made. Another notable mention is the entire MXM 8-series line-up. How about that - did we ever get confirmation that they indeed screwed up, and that things would be fixed and people refunded or something?

4. Well, if you were to wait for nVIDIA's answer, it wouldn't have been such a problem now, would it? Even they had their delays, even though they have a lot more resources to work with.

Sure, everyone makes bad decisions now and again, but as I said, nVIDIA is on my blacklist for doing stupid things pretty much non-stop.

GameWorks? How about Mantle? AMD was the first one around to come up with this kind of tech (a quick glimpse at GameWorks makes that clear), so who is playing catch-up? I think nVIDIA was pretty butt-hurt that they hadn't come up with something like it. Mantle was meant to be open source, but what's the point if there's no interest? Well, Intel actually expressed interest, but I seriously doubt that is what AMD was hoping for. I mean, Intel CPUs are more powerful, and that's no secret. In the APU market AMD wins because of the better iGPU. Now combine Mantle with Intel's strong CPU and average iGPU and you get quite the package - a package that would entirely demolish AMD, to be honest. So yeah, I can see why Mantle hasn't picked up anywhere outside AMD. Why bother dealing with the competition when they can come up with more or less the same on their own? I mean, it's the idea that's important - the realization as well, of course - but they have the money to throw at it, so they'll fix it one way or another. And then comes DX12.

PhysX? Again - nothing of theirs; thankfully, they had the money to buy it.

G-Sync? I already said what I had to say.

* I give them credit for MXM, though. I don't know what their intentions were, or whether it's really their idea (I haven't dug deep enough), but it's the single greatest thing that nVIDIA has ever made. Kudos to them for that. I'm serious.

TL;DR

I could add more to each point, but is it really necessary? If you don't get my point by now, adding more words would do nothing. It is already quite the wall of text, so most people will skip it anyway. Everyone buys Intel and nVIDIA by default. It takes a miracle for someone to actually buy AMD. So how on earth can AMD improve when there is no money coming their way?

I have even less income, and what I get for my money is well worth it in my eyes. I can't justify spending a lot more and getting a marginally better product. I mean, yeah, the 980M is what it is - pretty fast, but it could have been better AND they cut the overclocking. Tell me, how is that not a marginally better product? I would actually consider it worse. You get a cut-down chip and no overclocking - lovely. I'll squeeze the hell out of my 7970M until the desktop comes (if Clevo doesn't release anything new with an AMD chip on it, that is - I still have my hopes).


Wow, they should have told me this a long time ago, ha ha!

No reason not to overclock a decent laptop for benching... been doing it for years now without consequence. No need to overclock for simple gameplay, but life would be far too boring if all there was to do was play games. If you cut corners and buy a junky laptop to save a few bucks, it might get too hot, but thermals are never an issue with a good machine.

It sure did take them a LONG time to discover and fix that "bug" LOL. If they are going to be liars, they should at least try harder. They're not very good at it.

980M SLI - NVIDIA GeForce GTX 980M video card benchmark result - Intel Core i7-4930K,Clevo P570WM powered by PremaMod.com


780M SLI - NVIDIA GeForce GTX 780M video card benchmark result - Intel Core i7-3920XM Processor Extreme Edition,Alienware M18xR2


680M SLI - NVIDIA GeForce GTX 680M video card benchmark result - Intel Core i7-3920XM Processor Extreme Edition,Alienware M18xR2


580M SLI - NVIDIA GeForce GTX 580M video card benchmark result - Intel Core i7-2920XM Processor,Alienware M18xR1



Here was my response to one of the ignorant desktop owners...

http://www.techpowerup.com/forums/threads/nvidia-disables-geforce-gtx-900m-mobile-gpu-overclocking-with-driver-update.209820/page-5#post-3239125

1. nTune has long been discontinued and it doesn't support modern NVIDIA GPUs.

2. GPU boost is not overclocking. Maybe you have to read again what it does and how it works.

3. I don't care what OEMs provide - NVIDIA doesn't mention "overclocking" on their website.

4. Spending $2K on a gaming laptop sounds like a bad joke. $2K could/should be spent on a much better gaming PC (much better cooling, upgradability, acoustics and you can use up to five monitors).

Sigh.

P.S. Why don't you blame Intel and AMD for not providing the means to overclock mobile CPUs? Huh?

NVIDIA's just being stupid. Let's not drag Intel and AMD into the discussion in order to distract from the matter at hand. And, to set the record straight... mobile Intel CPUs DO OVERCLOCK... Extremely well (pun intended). Only uninformed folks who live in their own little isolated world think otherwise. It's truly amazing how many people speak about things they have no knowledge of.

Just like their desktop counterparts, you have to pay extra for an unlocked CPU. The crappy average processors don't do much in a desktop or a laptop. If you're an average Joe who buys a "gaming laptop" from Best Buy or Newegg, you do not get solid hardware. You get cheap gamer-boy garbage. Pay for average, get average. Pay for awesome, get awesome. There are no free rides in the desktop or laptop world.

AW18: Intel Core i7 4930MX @ 5.0GHz: CPU-Z Validator 4.0


M18xR2: Intel Core i7 Extreme 3920XM @ 4.9GHz: CPU-Z Validator 3.1


M18xR1: Intel Core i7 Extreme 2920XM @ 4.9GHz: CPU-Z Validator 3.1


And, my laptop with a desktop CPU installed in it...

P570WM: Intel Core i7 4930K @ 4.6GHz: CPU-Z Validator 4.0



  • Founder
Yes, of all the other points, those are the ones I care about. Why? Because at the end of the day, if they happen to fix this "issue", you might as well end up with an nVIDIA GPU once again.

1. How? Tell me, how? That's precisely why I wrote about how development goes. You have to have funds to fund the damn development. The HD 7970M was awesome, and it wasn't gimped by crappy vRAM like the 680M was. Yet most people were still all over the 680M. Why the crappy vRAM? Well, that's nVIDIA we are talking about - "hey, our 680M with crappy RAM and a cut-down core ties with the 7970M, let's save some for the 780M, they'll buy it either way." I mean, c'mon, they could have gone flat-out and made the 680M what the 780M is, but then what? Anyway, even with relatively equal performance, and a bit more when overclocked, nVIDIA was still the one people considered. And things have tanked for AMD ever since. You have to buy a product for the company to earn something and reinvest it. Why should you care? Well, look at this thread - that's why.

The HD 7970M was OK as a single card, but it was a disaster in Crossfire. A lot of us purchased this card for use in Crossfire and did not have working drivers for over a month, while the blame game shifted between AMD and the OEM vendors. See here for reference: A Consumer Nightmare – Alienware M18x-R2 + AMD 7970M Crossfire

2. G-Sync? The feature that now comes standard with the DP 1.2a specs? Again, that's nVIDIA we are talking about - money, money, money. "Hey, let's milk 200-250 dollars more for a display (with a board that's likely in the pocket-money range) that would have come anyway after a year or so - but we wouldn't be getting anything then, whereas now we can sell it as an exclusive feature." LOL! They are not creating anything new, they are just profiting on other people's ideas.*

DP 1.2a did not have the specs for this UNTIL NVIDIA released G-Sync, and AMD went running to VESA to get it implemented as a counter to G-Sync. In fact, Adaptive-Sync/FreeSync is still not on the market, while G-Sync is a proven technology. It requires an aftermarket board, but that's because it pre-dates DP 1.2a, and the additional memory on the module holds frames in a buffer for when fps drops below 30. See here for why this is a good thing: Mobile G-Sync Confirmed and Tested with Leaked Alpha Driver | PC Perspective. The final scalers in desktop monitors should be able to account for that, but we still don't know for sure. What we do know for sure is that the G-Sync module handles it, and it also has undisclosed features that NVIDIA plans to reveal. If you want to give credit where it's due, then NVIDIA deserves a lot of credit for finally solving the very old and frustrating problem of jitter + tearing.

3. Why not? Really, why not? I'll give an example with engines - air-cooled engines can go up to 210ºC, but water-cooled ones only up to 110ºC. The same goes for horsepower: you look at the specs on paper, but the actual performance may and will differ. What do I mean by this? I have always said, and will continue to say, that TDP is something to base your guess on, but it's not an exact science! The actual figures may and will differ, especially from one manufacturer to another. And to get back to the ºC - if they say it's fine, why not? If it fails, they are the ones who would cover it. Which brings us back to the main topic - nVIDIA is locking down and limiting things in order to cover up some f*%#$-up they made. Another notable mention is the entire MXM 8-series line-up. How about that - did we ever get confirmation that they indeed screwed up, and that things would be fixed and people refunded or something?

Even if AMD says 95C is within spec for their GPU core, that doesn't mean that other components on the board (like the VRMs) won't degrade faster from the additional heat. Furthermore, it limits overclocking potential (the 290/290X can't OC for shit) and it also tosses more heat into your case and/or home. Ever play a game during a hot summer's day in the Arizona desert? Yeah, well, you wouldn't want a couple of 95C cards pumping out heat during those summer months.

4. Well, if you were to wait for nVIDIA's answer, it wouldn't have been such a problem now, would it? Even they had their delays, even though they have a lot more resources to work with.

Sure, everyone makes bad decisions now and again, but as I said, nVIDIA is on my blacklist for doing stupid things pretty much non-stop.

GameWorks? How about Mantle? AMD was the first one around to come up with this kind of tech (a quick glimpse at GameWorks makes that clear), so who is playing catch-up? I think nVIDIA was pretty butt-hurt that they hadn't come up with something like it. Mantle was meant to be open source, but what's the point if there's no interest? Well, Intel actually expressed interest, but I seriously doubt that is what AMD was hoping for. I mean, Intel CPUs are more powerful, and that's no secret. In the APU market AMD wins because of the better iGPU. Now combine Mantle with Intel's strong CPU and average iGPU and you get quite the package - a package that would entirely demolish AMD, to be honest. So yeah, I can see why Mantle hasn't picked up anywhere outside AMD. Why bother dealing with the competition when they can come up with more or less the same on their own? I mean, it's the idea that's important - the realization as well, of course - but they have the money to throw at it, so they'll fix it one way or another. And then comes DX12.

PhysX? Again - nothing of theirs; thankfully, they had the money to buy it.

G-Sync? I already said what I had to say.

* I give them credit for MXM, though. I don't know what their intentions were, or whether it's really their idea (I haven't dug deep enough), but it's the single greatest thing that nVIDIA has ever made. Kudos to them for that. I'm serious.

TL;DR

I could add more to each point, but is it really necessary? If you don't get my point by now, adding more words would do nothing. It is already quite the wall of text, so most people will skip it anyway. Everyone buys Intel and nVIDIA by default. It takes a miracle for someone to actually buy AMD. So how on earth can AMD improve when there is no money coming their way?

I have even less income, and what I get for my money is well worth it in my eyes. I can't justify spending a lot more and getting a marginally better product. I mean, yeah, the 980M is what it is - pretty fast, but it could have been better AND they cut the overclocking. Tell me, how is that not a marginally better product? I would actually consider it worse. You get a cut-down chip and no overclocking - lovely. I'll squeeze the hell out of my 7970M until the desktop comes (if Clevo doesn't release anything new with an AMD chip on it, that is - I still have my hopes).

Mantle was a way for AMD to cover up the poor CPU overhead of their DX11 drivers. Don't believe me? Take a look at these D3D11 batch times:

[attached chart: D3D11 batch times]

or take a look at this:

[attached chart]

AMD has been notoriously bad with D3D CPU overhead and they still haven't fixed any of it; instead, they use Mantle as a band-aid. It was never meant to be an open-source solution, and that's why they claim it's still in "beta". But as you mentioned, DX12 has made it irrelevant. Also, NVIDIA are the ones that brought SLI back to gaming; prior to that, 3dfx had it before folding and having their patents purchased by NVIDIA. However, NVIDIA's implementation is completely different from the one 3dfx used - the only thing they have in common is the name. When SLI first came out, ATI's response was, of course, to follow the leader with their own AFR implementation in Crossfire. What about poor frame times in Crossfire? That plagued AMD from its inception until last year, when the FCAT controversy finally forced them to address the lingering problem. In fact, I mentioned that Crossfire had problems with stutter in our review: AMD 7970M Xfire vs. nVidia GTX 680M SLI Review

Crossfire and SLI both work relatively well most of the time. Anyone familiar with multi-GPU technology knows that microstuttering has been a problem with such configurations since they were first released. AMD and NVIDIA have both made great strides in addressing this, but it has still not been entirely eliminated by either manufacturer. However, during gameplay we noticed that the Crossfire setup experienced more noticeable stuttering even when the frame rate was well into the 60s or higher. Case in point: Battlefield 3 - when playing on Grand Bazaar with 7970M Crossfire, microstuttering was very evident, yet the same was not true on the same map and server with 680M SLI. In Batman: Arkham City, while running the built-in benchmark, I noticed areas of very visible stuttering that simply were not there with the 680M SLI, and the same was experienced again in Borderlands 2.

This was long before PCPer and other websites had any idea there was a stutter problem with Crossfire. In fact, the only other website to pick up on it was Hard|OCP in their reviews. Unfortunately, this still seems to be the case in games like Battlefield Hardline, where AMD doesn't have the benefit of the Mantle crutch:

[attached chart: Battlefield Hardline frame times]

From Hard|OCP:

AMD GPU Performance

There is a big difference currently in the gameplay experience between NVIDIA GPUs and AMD GPUs in Battlefield Hardline. This comes down to what we have tested as the frame times. For whatever reason AMD GPUs are doing very poorly in terms of frame time in this game. There are large peaks of longer periods of time between frames. There is wild inconsistency, and a general higher frame time average than with NVIDIA GPUs.

This translates into the game feeling choppy even though the frame rates are high on AMD GPUs. It affects every AMD GPU we tested: the Radeon R9 290X, Radeon R9 290 and Radeon R9 285. It affects every map we played; the larger maps like Dust Bowl were the worst. This is a dramatic and devastating detriment to gameplay on AMD GPUs. As it is right now, NVIDIA GPUs simply offer smoother gameplay in Battlefield Hardline.

There is one month until release of this game. AMD right now has the opportunity to improve gameplay on its video cards by the time the game releases. Otherwise, there will be a lot of complaints from AMD video card users.

Source: HARDOCP - Conclusion - Battlefield Hardline Video Card Performance Preview
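(A side note for anyone who wants to check their own system: this kind of frame-time inconsistency is easy to quantify from a per-frame time log. Below is a minimal sketch in Python, assuming a plain-text log with one frame time in milliseconds per line - a hypothetical format, so adjust the parsing to whatever your capture tool actually exports.)

```python
# Minimal sketch: summarize frame-time consistency from a log of per-frame
# render times in milliseconds, one value per line (hypothetical format -
# adapt the parsing to the tool you actually use).
import statistics

def frame_time_report(path):
    with open(path) as f:
        times = [float(line) for line in f if line.strip()]

    ordered = sorted(times)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]           # 99th percentile frame time
    swings = [abs(b - a) for a, b in zip(times, times[1:])]  # frame-to-frame jumps
    median = statistics.median(times)

    return {
        "frames": len(times),
        "avg_fps": round(1000.0 / statistics.mean(times), 1),
        "p99_ms": round(p99, 2),
        "worst_swing_ms": round(max(swings), 2) if swings else 0.0,
        # crude stutter count: frames that took more than twice the median time
        "stutter_frames": sum(t > 2 * median for t in times),
    }

if __name__ == "__main__":
    print(frame_time_report("frametimes.txt"))  # placeholder file name
```

A high average FPS with a large 99th-percentile frame time or big frame-to-frame swings is exactly the "choppy despite high frame rates" behaviour described above.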

So those of us who buy NVIDIA hardware do it because we know their hardware and software are simply more refined than what AMD offers. AMD has improved along the way, but many times it's a case of too little, too late, as with Crossfire (though their XDMA implementation is great). However, now, with NVIDIA's arrogance growing by leaps and bounds, AMD has the perfect chance to win over some customers who had traditionally written them off. Now is the time for them to step up with the R300 series and offer a comprehensive desktop and mobile solution that is on par with NVIDIA. If they can deliver, they will regain a lot of their lost market share.



...

We are really getting lengthy, so I cut your quote off :P

I'll get to multi-GPUs later.

That's your take on G-Sync and the DP standards, and I'll need proof of it. I'll tell you this, though - "Version 1.3 was published in February 2011; it includes a new Panel Self-Refresh (PSR) feature developed to save system power and further extend battery life in portable PC systems." That's a quote about the eDP 1.3 revision, and this is what AMD wanted passed into DP as well. So how is this bad? More monitors could be made without additional proprietary modules, and they would be compatible with everything. Again, how is this bad? This is the technology that G-Sync is based on - nothing revolutionary. The hardware and the specs weren't there on desktops, so they just cashed in on it. Of course it works wonders, but they could have asked VESA to include what was already developed in the sub-standard in question. Trust me, you don't need to show me the advantages - I know them; that's why I really hope something like FED or SED comes around. I'm keeping an eye on CNT-FED, and I hope it hits the market eventually. Also, that's why I have a FW900 ;)

I've been to Spain (40ºC, 104ºF), so I know what it is to have a hot desktop in a hot room. Again, if the manufacturer is there to warrant it, why not? Not to mention that most of those cards get their stock coolers thrown away and are water-cooled anyway, and you could link that loop to the boiler instead... profit? I think yes. A hot shower after an intense session - lovely. I see nothing from you about the mobile 8 series, though. Where is your manufacturer there?

So this is your response to Mantle? You are blaming them for not being innovative, yet this is your response? Even if it is indeed there to fix their unoptimized drivers, it does gain performance now, doesn't it? So how is this not a good thing? I mentioned DX12, which is fine - everyone gets to touch the metal and orgasms flow everywhere... if you are on W10. You know, W7 plus Mantle is the preferred combo in my book. So is that a plus on AMD's side? Not to mention that DX12 most likely wouldn't have been what it is now if it weren't for Mantle. I have no proof of that, but there is none of the opposite either. I'm pretty certain it was the Android and iOS case all over again. As I said before, the idea is a really damn important thing. The realization is no less important, but you have to have an idea as a starting point!

Oh yes, I forgot about SLI - another technology that was bought. I know about the implementation, but should I repeat myself about ideas? It's the third time already. But I will repeat myself about implementation. So here is my idea-and-realization talk again: nVIDIA bought SLI (patents, 3dfx, whatever) and turned it upside-down. They have the money to do it. What about CrossFire? Well, that's the result you get with less money. Are you following? Here's an idea - it wasn't AMD's, but it wasn't nVIDIA's either, and we can see that the end result depends on the money thrown at it. Oh, and do you think, or are you implying, that SLI is perfect? No, I won't respond with anything - I know the answer, I want to hear/read it from you. I'll say this: I'll consider a multi-GPU setup when the technology is there. For now it's mostly (not entirely) for number chasing. There's performance to be gained... when it works. When it doesn't, it's 500 to 1000 (probably more) dollars in hardware that you switch off to play on a single card... Right. Sorry, but when I buy something I like to actually use it. XDMA is getting things close. Then comes the game support, which is also part of the implementation. Before you jump on me with "yeah, but SLI sucks because of the developers" - you have a nice technology, great, but what does it mean if no one uses it? The same goes for Mantle, which is sad, because it is THE ONLY way to stick with W7 AND have great performance.

I agree, it's now or never. If AMD doesn't cash in on this, they are out. Whether they do is entirely up to you. nVIDIA has been playing the attention whore the whole damn time - as soon as they start to lose attention, they pull some cheap trick and regain it. They have the means, they have the money, they can do whatever they want. Feel free to buy from them again; soon it will be the only choice.


This is terrible news for us laptop gamers who love to overclock... Looks like all of us are going to be married to driver 344.75. I'd rather switch to AMD if NVIDIA doesn't re-enable overclocking, or switch to a desktop. This is ridiculous...

I was already desperately looking for the first news about the R9 M300 series... Unfortunately, it won't be all that strong. On the other hand, I bought nVidia recently because their cards could be maxed out by overclocking. Without this "feature" I wonder if I am going to buy one of those cards again - especially as they are almost twice as expensive.

About being married to 344.75/.80: this will surely change when the drivers that support DX12 are out. Then one will have to make a tough decision. I guess nVidia - as much as I despise it - picked the time for this step wisely. Any sooner and many of us would have thought twice before buying a dead-end card like the 970M/980M.

Best regards

phila

P.S.: I wonder if there is a chance that modders like the guys from LaptopVideo2go can edit the latest drivers accordingly?!

P.P.S.: Sorry for the double post!!!


@svl7, @Prema, maybe you guys can have a look at the newest drivers and spend some time figuring out if anything can be done? I'll be happy to donate if some modded drivers can be made...

Good luck, if this is what it will come to.

Maybe start by looking in nvapi.dll and nvapi64.dll and checking for differences between the 347.xx and prior drivers...
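(If anyone wants a concrete starting point for that comparison, here's a minimal sketch using Python and the pefile module that diffs the exported symbols and section hashes of two copies of nvapi64.dll. The driver folder paths are placeholders, not real install locations.)

```python
# Minimal sketch: compare nvapi64.dll from two driver packages (e.g. 344.75 vs 347.xx)
# by diffing exported symbols and hashing each PE section. Requires: pip install pefile
import hashlib
import pefile

def summarize(path):
    pe = pefile.PE(path)
    exports = set()
    if hasattr(pe, "DIRECTORY_ENTRY_EXPORT"):
        for sym in pe.DIRECTORY_ENTRY_EXPORT.symbols:
            exports.add(sym.name.decode() if sym.name else f"ordinal_{sym.ordinal}")
    sections = {
        sec.Name.rstrip(b"\x00").decode(errors="replace"): hashlib.sha256(sec.get_data()).hexdigest()
        for sec in pe.sections
    }
    return exports, sections

# Placeholder paths - point these at DLLs extracted from the two driver packages.
old_exports, old_sections = summarize(r"C:\drivers\344.75\nvapi64.dll")
new_exports, new_sections = summarize(r"C:\drivers\347.xx\nvapi64.dll")

print("Exports added:  ", sorted(new_exports - old_exports))
print("Exports removed:", sorted(old_exports - new_exports))
for name in sorted(set(old_sections) | set(new_sections)):
    if old_sections.get(name) != new_sections.get(name):
        print("Section changed or added/removed:", name)
```

Changed sections at least tell you where to start disassembling; it won't tell you what the check does, but it narrows the search.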


I'm pissed about this move too, but in the case of the 980M it may well save some cards. Has anyone else noticed that it's missing a VRM across the top, leaving just the solder points where it would go, while the 970 has all four? I'm not an electrical engineer or anything - maybe the third inductor offsets the need for that VRM, but what if it doesn't? Too much voltage and boom, the Fermi series all over again?

http://rjtech.com/shop/images/detailed/DSC01680.jpg

Look at the top right of the card.

Sent from my Nexus 6 using Tapatalk


  • Founder
I'm pissed about this move too, but in the case of the 980M it may well save some cards. Has anyone else noticed that it's missing a VRM across the top, leaving just the solder points where it would go, while the 970 has all four? I'm not an electrical engineer or anything - maybe the third inductor offsets the need for that VRM, but what if it doesn't? Too much voltage and boom, the Fermi series all over again?

http://rjtech.com/shop/images/detailed/DSC01680.jpg

Look at the top right of the card.

Sent from my Nexus 6 using Tapatalk

It could very well be part of the reason, but the answer, I think, is probably more complex than a single point. Most likely a bunch of reasons factored into their decision, ranging from OEMs not liking users holding on to systems longer (hence the sudden move away from MXM), bumpgate, cost cutting on the 980M board (though I can't imagine much savings from shaving off a single VRM), the TDP issue @Prema mentioned, etc. Plus, NVIDIA also knows they have the mobile market cornered, so they can kinda do what they want now - why not just force people to pay more for faster refreshes? It helps relieve the R&D strain by allowing them to milk a design a lot longer than they normally could.

Ultimately, whatever their reason may be, it sucks for current Maxwell owners. The best way forward is to spread the word about this change and tell people to skip Maxwell mobile products and see what AMD's R9 300 series offers (and damn, do I hope it's something good). But if AMD fails to deliver, well, then I recommend this to make it hurt a little less:

[attached image]

I just finished upgrading my system, but I may put my money where my mouth is in January 2016 by going AMD again if the R9 300 is good - I'd really like it if AMD built it on 20 nm.


It could very well be part of the reason, but the answer, I think, is probably more complex than a single point. Most likely a bunch of reasons factored into their decision, ranging from OEMs not liking users holding on to systems longer (hence the sudden move away from MXM), bumpgate, cost cutting on the 980M board (though I can't imagine much savings from shaving off a single VRM), the TDP issue @Prema mentioned, etc. Plus, NVIDIA also knows they have the mobile market cornered, so they can kinda do what they want now - why not just force people to pay more for faster refreshes? It helps relieve the R&D strain by allowing them to milk a design a lot longer than they normally could.

Ultimately, whatever their reason may be, it sucks for current Maxwell owners. The best way forward is to spread the word about this change and tell people to skip Maxwell mobile products and see what AMD's R9 300 series offers (and damn, do I hope it's something good). But if AMD fails to deliver, well, then I recommend this:

[attached image]

It most likely does indeed have to do with the OEMs wanting yearly upgrades... I mean everything is going soldered... Clevo releasing a soldered system is just blasphemous...

All they are going to do is drive everyone to alternatives: portable desktops, full-size desktops for home use plus a cheap Intel laptop for travel, things like the Surface (which is getting better and better with each generation), tablets - or maybe AMD will pull a rabbit out of its hat.

As an owner of a 980M SLI machine, I'm annoyed, but the cards are so strong that I can play The Witcher 2 with a custom 3K downsampled resolution and average 50 FPS... I've got Prema's modded vBIOS and I see no need to overclock, although I found out that my cards do 1300 MHz on stock voltage.

Sent from my Nexus 6 using Tapatalk


  • Founder
It most likely does indeed have to do with the OEMs wanting yearly upgrades... I mean everything is going soldered... Clevo releasing a soldered system is just blasphemous...

All they are going to do is drive everyone to alternatives: portable desktops, full-size desktops for home use plus a cheap Intel laptop for travel, things like the Surface (which is getting better and better with each generation), tablets - or maybe AMD will pull a rabbit out of its hat.

As an owner of a 980M SLI machine, I'm annoyed, but the cards are so strong that I can play The Witcher 2 with a custom 3K downsampled resolution and average 50 FPS... I've got Prema's modded vBIOS and I see no need to overclock, although I found out that my cards do 1300 MHz on stock voltage.

Sent from my Nexus 6 using Tapatalk

If we (the community) put the histrionics aside and look at the big picture, it may not necessarily be the end of high-end notebook gaming like we think. I think it's evolving away from MXM and more towards what AW has done with external high-speed e-GPU solutions for gamers who want bleeding-edge speed. So you get a slim laptop with, perhaps, a soldered-on 980M-class card that doesn't OC, but you can attach a desktop GM200/R9 390X to it and OC that to your heart's desire (well, PSU permitting). This way you have a mobile gaming notebook and the power of a desktop on tap as well. I kinda like this and figured it would happen, though I also wish CPUs weren't going the soldered route - but what can you do, it's not a perfect world.

Take a step back and look at where gaming notebooks were headed - it was getting a bit obscene until Maxwell arrived. We had Clevo shipping dual 240W (or was it 300W?) PSUs just to cope with SLI, and AW was pushing the limits of their SLI systems and their PSUs. That's why we had hardcore enthusiasts like @Mr. Fox and @StamatisX building custom dual-PSU solutions. But not everyone is as hardcore as they are, or willing to do that kind of work just to be on the bleeding edge, and it probably wasn't financially feasible or sensible for companies like AW to keep pursuing a design philosophy that always requires bulk, cost and extra measures to control TDP. The logical solution is the one AW took, and I applaud part of it, but at the same time I think soldering the CPU was a huge mistake. Nonetheless, after thinking about it, I believe e-GPU solutions, if given a chance and a bit more creativity, can lead gaming notebooks into a whole new era of awesome performance. OEMs won't have to contend with designing bulky laptops that have to dissipate 100W+ of TDP and can instead make sexier, higher-quality notebooks that have the e-GPU option available for high-performance gaming.

Hell, give me a really nice Razer/MBP-type notebook with just a decent soldered GPU (e.g. a 970M) and the ability to hook it up to a universal e-GPU connection (not proprietary like AW's), and I just may purchase it. Personally, I had written off gaming notebooks for nearly two years (hence the desktop in my sig), but this could get me to look at them again. I probably wouldn't use the e-GPU part myself, but I wouldn't mind getting my gf into PC gaming, and what better way than a sexy new laptop that can also double as a high-performance gaming monster via an e-GPU that isn't confined by notebook thermals? Win/win.


I'd have absolutely no problem with an eGPU if it gets the same bandwidth a desktop would give it. In fact, I'd prefer it. If that's their plan, I'm all for it. It would drive the price of desktop cards up, though. Oh, and the dual-PSU solution is 330W each - I have it.
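(To put rough numbers on that bandwidth concern, here's a quick back-of-the-envelope sketch of PCIe 3.0 link bandwidth per lane count. The assumption that a typical eGPU enclosure runs at x4 rather than a desktop's x16 is mine, not something stated in this thread.)

```python
# Back-of-the-envelope PCIe 3.0 bandwidth: 8 GT/s per lane with 128b/130b
# encoding, i.e. roughly 0.985 GB/s of usable bandwidth per lane, per direction.
GT_PER_SECOND = 8.0
ENCODING_EFFICIENCY = 128 / 130   # 128b/130b line encoding
BITS_PER_BYTE = 8

per_lane_gb_s = GT_PER_SECOND * ENCODING_EFFICIENCY / BITS_PER_BYTE

for lanes in (1, 4, 8, 16):
    print(f"PCIe 3.0 x{lanes}: ~{per_lane_gb_s * lanes:.1f} GB/s per direction")

# Assumed comparison: a desktop x16 slot (~15.8 GB/s) vs an eGPU link running
# at x4 (~3.9 GB/s) - a 4x difference in raw link bandwidth.
```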

Sent from my Nexus 6 using Tapatalk


  • Founder
I think I will put this right here, just in case NVIDIA decides to delete it later on... feel free to use the link to this image as you see fit. [http://i.imgur.com/LUVLHtu.png]

It might be useful as a plaintiff exhibit some day.


Doesn't the above address PC desktop gaming, though? I mean, they even recommend using water cooling and mention well-ventilated cases. I don't see anything about notebooks in there.


Look what I found lurking in my old emails, LOL.

-----Original Message-----

From: NVIDIA GPU Litigation Settlement Administrator [mailto:[email protected]]

Sent: Sunday, December 19, 2010 2:52 PM

To: mr_fox_rox

Subject: NVIDIA GPU Litigation Email Registration Confirmation

Dear Mr. Fox:

You have successfully registered your email address to receive future updates regarding the NVIDIA GPU Litigation.


Doesn't the above address PC desktop gaming, though? I mean, they even recommend using water cooling and mention well-ventilated cases. I don't see anything about notebooks in there.
I see no disclaimers for notebooks in the article. In fact, taken at face value, one might conclude the opposite of what you are saying. Without a disclaimer, they are on the hook. Notebooks can have well-ventilated cases.


Ha bumpgate settlement?
Yup... ;)

  • Founder
I see no disclaimers for notebooks in the article. In fact, taken at face value, one might conclude the opposite of what you are saying. Without a disclaimer, they are on the hook. Notebooks can have well-ventilated cases.


Yup... ;)

I dunno, man, I think the intent is pretty clear that it's about desktops, though maybe some would also think it extends to notebooks. But the way they talk about system modifications (water cooling, optimizing case airflow) isn't something people typically associate with notebooks. I really don't see any litigation arising from this at all.


Doesn't the above address PC desktop gaming, though? I mean, they even recommend using water cooling and mention well-ventilated cases. I don't see anything about notebooks in there.

Yeah, I don't think it helps our cause to advertise that link! At worst it might make desktop users feel safe that overclocking will remain available to them in the future. We shouldn't advertise that link - we want the support of the desktop community. Of course, if NVIDIA locks down mobile GPUs, then we all know it's more likely they'll end up doing it to desktops in the future too.


I dunno, man, I think the intent is pretty clear that it's about desktops, though maybe some would also think it extends to notebooks. But the way they talk about system modifications (water cooling, optimizing case airflow) isn't something people typically associate with notebooks. I really don't see any litigation arising from this at all.
All is fair in law and war. Involvement in civil litigation has been part of my career. If they do not clearly state the intent to exclude notebooks, intent to the contrary, whether real or fabricated, does not count.

C'mon... stupid broad sues McDonald's for getting burned with hot coffee? Because they did not advise her it was hot? Burglars sue homeowners for medical bills and general damages (pain and suffering) for being shot (not killed) or beaten and get away with it? This would count against them, big time. Once it is admitted into evidence by a judge, the jury decides how to interpret it and an unstated intent is meaningless.

I bet if we took the time to look, we could find (and capture to preserve) overclocked notebook benchmarks from Pidge and ManuelG as well. They say nothing against it, do it in public, provide tools to facilitate it, then change their "permission" suddenly and unexpectedly. There's something behind this, and I suspect it has nothing to do with notebook overclocking in a general sense. I smell an engineering defect they are trying to conceal instead of holding themselves accountable for their mistakes. If they take a carte blanche position against overclocking, then their current product's failure (Maxwell) becomes a "see, we told you so" instead of a "this is our fault, please let us fix it" scenario. Bumpgate 2.0.


  • Founder
All is fair in law and war. Involvement in civil litigation has been part of my career. If they do not clearly state the intent to exclude notebooks, intent to the contrary, whether real or fabricated, does not count.

C'mon... stupid broad sues McDonald's for getting burned with hot coffee? Because they did not advise her it was hot? Burglars sue homeowners for medical bills and general damages (pain and suffering) for being shot (not killed) or beaten and get away with it? This would count against them, big time. Once it is admitted into evidence by a judge, the jury decides how to interpret it and intent is meaningless.

I bet if we took the time to look, we could find (and capture to preserve) overclocked notebook benchmarks from Pidge and ManuelG as well. They say nothing, do it, then change their "permission" suddenly and unexpectedly. There's something behind this, and it has nothing to do with notebook overclocking in a general sense. I smell an engineering defect they are trying to conceal.

Well, with the messed-up legal system we have in the US, I guess that can always be a distant possibility. Nonetheless, it looks like daddy AMD has something up its sleeve: M295X MXM card using AMD Radeon


