
PASCAL-MXM & P-SERIES REFRESH



1 hour ago, Dr. AMK said:

Dear friend, just listen to @Mr. Fox's advice in this regard and you will not regret it. I'm using a 55" 4K LG TV as a monitor and I can really appreciate the 4K detail in practice. A 17" 4K panel, in my opinion, is just a waste of money, though for people with money to spare it would be nice to have, especially if it supports 120Hz :)

 

I was referring to the 1080p 120Hz monitors (I definitely don't want 4K; more to the point, I don't want 60Hz).

I've been informed by Hidevolution that the current 120Hz panel also ships paired with a non-G-Sync GPU, i.e. if you were to swap in a G-Sync display on the laptop itself, it still wouldn't work. That would be really crappy when it comes to selling someone the old card at upgrade time, since most people would have G-Sync displays.

 

Additionally, I'm not clear on what display will be on offer soon, other than it being 120Hz and G-Sync. Is it going to be 5ms? I don't mind whether it's TN or IPS.

 

Edited by TBoneSan
  • Thumbs Up 2

Thank you @Prema for the explanation. So that means if someone buys the P870DM2 with one card and later decides to upgrade to SLI, he will need to buy the second GPU and the 'GRID' heat sink as well. But what kind of problems will he face? And what do you mean by "The GRID heatsink does fit into the DM2 but NOT the original P870DM model."?

 

The point I'm trying to get at: can P870DM2 owners with 1x GTX 1080 upgrade their laptops to 2x GTX 1080 SLI straightforwardly or not? If not, what should they do to get the upgrade done? Please explain more.

DM2 owners can upgrade to DM3 spec if they can source, and are willing to pay for, the following (see the rough power-budget sketch below):

- 1x GTX 1080 GPU

- the DM3 GRID heatsink

- 2x 330W AC adapters
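
For anyone wondering why the second adapter is on the list, here is a minimal back-of-the-envelope sketch in Python. All of the wattage figures are assumptions for illustration (roughly 180 W sustained per GTX 1080 MXM card, about 95 W for the 6700K, about 50 W for the rest of the system), not official Clevo specifications.

```python
# Rough power-budget sketch for the DM2 -> DM3 SLI upgrade discussed above.
# Every wattage figure here is an illustrative assumption, not an official Clevo spec.
import math

GPU_TDP_W = 180      # assumed sustained draw per GTX 1080 MXM card
CPU_TDP_W = 95       # assumed draw for a 6700K (more when overclocked)
SYSTEM_MISC_W = 50   # assumed board, RAM, drives, fans, display
ADAPTER_W = 330      # rated DC output of one AC adapter

def adapters_needed(num_gpus: int) -> int:
    """Return how many 330 W adapters the assumed load would require."""
    load_w = num_gpus * GPU_TDP_W + CPU_TDP_W + SYSTEM_MISC_W
    return math.ceil(load_w / ADAPTER_W)

for gpus in (1, 2):
    load = gpus * GPU_TDP_W + CPU_TDP_W + SYSTEM_MISC_W
    print(f"{gpus}x GTX 1080: ~{load} W -> {adapters_needed(gpus)}x 330 W adapter(s)")
```

Under those assumptions a single card lands just under one adapter's rating, while SLI clearly needs both, which lines up with the list above.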


16 minutes ago, TBoneSan said:

 

I was referring to the 1080p 120Hz monitors (I definitely don't want 4K; more to the point, I don't want 60Hz).

I've been informed by Hidevolution that the current 120Hz panel also ships paired with a non-G-Sync GPU, i.e. if you were to swap in a G-Sync display on the laptop itself, it still wouldn't work. That would be really crappy when it comes to selling someone the old card at upgrade time, since most people would have G-Sync displays.

Additionally, I'm not clear on what display will be on offer soon, other than it being 120Hz and G-Sync. Is it going to be 5ms? I don't mind whether it's TN or IPS.

 

I don't play games much on the laptop screen, or even do my work on it, so I really don't care much about that. My investment goes into a big, high-end external monitor; I find that more reasonable for me.

I've attached photos of my monitors, the one at the office for work and the one at home. Maybe they're not the best out there, but I'm happy with them so far.

[Attached images: LG 34UM95 (work) and LG UF950T (home)]

  • Thumbs Up 1

5 hours ago, Dr. AMK said:

Thank you @Prema for the explanation. So that means if someone buys the P870DM2 with one card and later decides to upgrade to SLI, he will need to buy the second GPU and the 'GRID' heat sink as well. But what kind of problems will he face? And what do you mean by "The GRID heatsink does fit into the DM2 but NOT the original P870DM model."?

The point I'm trying to get at: can P870DM2 owners with 1x GTX 1080 upgrade their laptops to 2x GTX 1080 SLI straightforwardly or not? If not, what should they do to get the upgrade done? Please explain more.

Yeap, it's as easy as adding a second card plus the GRID heat sink. The GRID isn't as expensive as one might expect. You'll also need 2x 330W adapters.

 

EDIT: I guess Prema already answered.

Edited by bloodhawk
  • Thumbs Up 4

Yeap, it's as easy as adding a second card plus the GRID heat sink. The GRID isn't as expensive as one might expect. You'll also need 2x 330W adapters.

EDIT: I guess Prema already answered.


How much is a grid?

Sent from my SM-G900P using Tapatalk

  • Thumbs Up 1

We need that MSI rep to come over here so we can speak freely without being censored.

 

http://forum.notebookreview.com/threads/official-questions-for-the-msi-rep-2.795226/page-4#post-10326391

 

Dear Loyal Gamers,

We have heard and read each and every one of your posts over the past weeks and are taking this issue very seriously with our engineering teams and external partners. As a GT owner, I also made the decision to purchase these laptops for their innovative MXM format, hoping to extend the life of my laptop, so we feel your frustration.

As everyone knows, the performance of the latest GeForce® GTX 1080/1070/1060 chips has jumped significantly over the last generation. This could not have been possible without redesigning the motherboard and modifying power delivery, thermals, and other components in order to utilize the full performance of the Pascal chips. MSI has been working internally and with NVIDIA to find a suitable upgrade solution in order to meet the demand for GPU upgrades without causing any disruption to your experience.

In order to keep our original promise of upgradeable GPUs, MSI will be offering a trade-in program for those who currently have GT72/GT80 models with GTX 800 and 900 series graphics inside toward new GT72/GT83 models with GTX 10-series graphics, packed with our latest gaming features.

Owners of the mentioned products can download and fill out the attached form. Please send the form to [email protected] before Oct. 31st, 2016.

MSI will send the specifics of this trade-in program soon.

Please note that this program is available in the USA only.

We appreciate all your feedback.

Thank you for showing up and saying pleasant things to try to calm people down. That's mighty nice of you. No disrespect to you personally, sir, for the things I am about to say. I realize you are the messenger. As such, please take a message home to MSI for us. I'm going to share honestly what needs to be said.

This is a bunch of double-talk scripted BS. NVIDIA has bent us all over and nailed us from behind super hard with Pascal, and we need MSI and Clevo to draw a line in the sand. Things have never been so messed up with NVIDIA and Intel, and I'm fed up with all of the nonsense, as well as the namby-pamby OEMs that are willing to let them get away with it. I have been a high-performance notebook enthusiast/overclocker for a very long time and I have never been more dissatisfied with the filth I am seeing from you guys. What happened with the Pascal GPU refresh is an absolute farce. Millions of customers, thousands of whom have machines that are still brand new, shafted in a single bound by proprietary crap GPUs. MSI and Clevo are partially to blame for tolerating stupidity from the idiots at NVIDIA and Intel.

Clevo and MSI are the last two standing in what used to be a fairly vibrant niche with MXM, and it is incumbent on both companies to partner up and work with each other to stand against the evil intentions of the Intel and NVIDIA Mafia. MSI, in particular, needs to man up, knock it off with the stupid crippled BGA feces CPUs, and start using at least K-series desktop quad-core CPUs like Clevo. I mean, c'mon man... a Titan beast machine with a wimpy candy-butt wuss BGA piece of trash CPU. What a stinking joke. Just because a lot of people are too stupid to say no to BGA doesn't mean it is OK to take advantage of them. You're taking advantage of the ignorance of the masses, and that ain't cool. MSI could potentially rule the roost with excellent build quality and intelligent design, but the half-hearted BGA filth CPU is a deal-killing piece of trash that the company should be ashamed of.

We want you to be successful, but all we are seeing is a trend of failure upon failure lately. Now that Alienware has turned to crap, MSI and Clevo are the only high-performance notebook OEMs that enthusiasts can put their hope in, and all MSI is bringing to the table is MXM. Whoopty-doo. You guys are way better than that if you can get your act together. If you really, honest to God, understand our frustration, for pity's sake stop creating it.

Take that insulting trade-in bait-and-switch Ponzi scheme somewhere else, step up to the plate, and actually fix the mess you helped create. Otherwise many of us who have not already will be abandoning notebooks and going back to desktops. All you will have left for customers will be netbook gamer children that don't have a clue what quality and value look like, much less show-stopping performance.

Again, this message is for HQ, not for you. You're awesome. Have a nice evening, and thanks for listening and taking good notes.

As a side note, the most powerful laptop of 2015 shown in my signature is for sale because of the problem I outlined here. The proprietary crap has castrated it and made it essentially a worthless dead end, like all of the lesser products of yesterday that were killed by the NVIDIA Nazis.

 


2 hours ago, Dr. AMK said:

This video is nice, but at the end something strange happens: the CPU temp hits 99 degrees??!! Is this normal?

No. The heat sink probably is not fitting correctly. This seems to be a frequent issue for Clevo. Now that both GPUs use a unified heat sink design the chance of having difficulties has doubled. 


No. The heat sink probably is not fitting correctly. This seems to be a frequent issue for Clevo. Now that both GPUs use a unified heat sink design the chance of having difficulties has doubled. 


Don't forget that with a vapor chamber you can't try to correct it yourself without a higher risk of destroying the heat sink as well!

Sent from my SM-G900P using Tapatalk

  • Thumbs Up 3

11 hours ago, Mr. Fox said:

No. The heat sink probably is not fitting correctly. This seems to be a frequent issue for Clevo. Now that both GPUs use a unified heat sink design the chance of having difficulties has doubled. 

"The heat sink probably is not fitting correctly" is that really possible ? , or maybe the thermal past is not installed correctly.

Edited by Dr. AMK

4 hours ago, Dr. AMK said:

"The heat sink probably is not fitting correctly" is that really possible ? , or maybe the thermal past is not installed correctly.

Sorry, I thought you were saying the GPUs were getting too hot. I did not watch the video. It could still be a poorly fitting heat sink, though. A couple of P870DM-G owners have had issues with the CPU heat sink not fitting properly. @bloodhawk was one, and I think maybe it was @TBoneSan that needed to replace his. Likewise, the 6700K needs to be delidded and pasted with Liquid Ultra to run at its coolest. If you do much overclocking with the stock paste under the IHS, the temps are too high.

 

The new "Grid" GPU heat sink may also be contributing to high CPU temperatures. I'm not particularly keen on the idea of moving heat from the GPUs to CPU. If the CPU is already running too hot that might just make it even hotter.


On 8/25/2016 at 2:30 PM, ajc9988 said:


How much is a grid?

Sent from my SM-G900P using Tapatalk
 

Depending on where you get it from, around $100. 

29 minutes ago, Mr. Fox said:

Sorry, I thought you were saying the GPUs were getting too hot. I did not watch the video. It could still be a poorly fitting heat sink, though. A couple of P870DM-G owners have had issues with the CPU heat sink not fitting properly. @bloodhawk was one, and I think maybe it was @TBoneSan that needed to replace his. Likewise, the 6700K needs to be delidded and pasted with Liquid Ultra to run at its coolest. If you do much overclocking with the stock paste under the IHS, the temps are too high.

 

The new "Grid" GPU heat sink may also be contributing to high CPU temperatures. I'm not particularly keen on the idea of moving heat from the GPUs to CPU. If the CPU is already running too hot that might just make it even hotter.

Yeap, I got so frustrated with the heat sink that I did this to it: http://imgur.com/a/Gw2XM

@Dr. AMK

Edited by bloodhawk
  • Thumbs Up 5

11 minutes ago, bloodhawk said:

Depending on where you get it from, around $100. 

Yeap, I got so frustrated with the heat sink that I did this to it: http://imgur.com/a/Gw2XM

@Dr. AMK

I was referring to the CPU heat sink. Did you have to replace that as well, or did I confuse you with someone else? Your GPU heat sink mod was definitely cool stuff.

 

Now that I think back, I believe that @Phoenix also had a bad CPU heat sink on his machine.


Yeap, I got so frustrated with the heat sink that I did this to it: http://imgur.com/a/Gw2XM

@Dr. AMK


The GRID is CHEAPER than the unified heatsink in the ZM! If it works so well, why don't they replace the ZM and DM heatsinks with a straight vapor chamber? (That's right: they'd have to make it with less variance, and their QA with Foxconn sucks...)

Sent from my SM-G900P using Tapatalk

  • Thumbs Up 1

3 hours ago, Mr. Fox said:

I was referring to the CPU heat sink. Did you have to replace that as well, or did I confuse you with someone else? Your GPU heat sink mod was definitely cool stuff.

 

Now that I think back, I believe that @Phoenix also had a bad CPU heat sink on his machine.

Ah, actually yeah. I replaced that as well, and shimmed it.

  • Thumbs Up 2

@Mr. Fox yeah, I replaced mine too and got a decent fit after that. The first one was rubbish. Whoever passed it through QC owes me $100 and needs to be sent to the naughty corner.

Edited by TBoneSan
  • Thumbs Up 2

@Prema, any thoughts on this cooling gimp with the P775DM2/3?

 

 

Post@NBR

Quote

There were 7 heat pipes on the old model with the 980M; the current model with the 1080 only gets 5 heat pipes, and the 1080 has a higher TDP than the 980M. The result is that the CPU is toasty on the new model.

 

Post@NBR

Quote

Also notice that besides having fewer heat pipes, in the P775DM2 the heat pipes don't run all the way to the end of the copper fin array, whereas they do in the P775DM1.
 

 

Also, the fans seem to look small compared to the old 60mm fans, though they are 12V ones. Edit: and they have thick blades versus the old thin ones.
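
To put rough numbers on those two quotes, here is a small back-of-the-envelope comparison of GPU heat load per pipe. The TDP values (about 100 W for the GTX 980M and about 150 W for the notebook GTX 1080) are commonly quoted approximations assumed here for illustration, not measured figures for the P775 boards, and the sketch ignores that the pipes are shared with the CPU.

```python
# Back-of-the-envelope heat-load-per-pipe comparison for the two quotes above.
# The GPU TDP values are assumed approximations, not measured figures.

configs = {
    "P775DM1 (GTX 980M, 7 pipes)": {"gpu_tdp_w": 100, "heat_pipes": 7},
    "P775DM2 (GTX 1080, 5 pipes)": {"gpu_tdp_w": 150, "heat_pipes": 5},
}

for name, cfg in configs.items():
    per_pipe_w = cfg["gpu_tdp_w"] / cfg["heat_pipes"]
    print(f"{name}: ~{per_pipe_w:.0f} W per heat pipe")
```

Even with these rough assumptions, each remaining pipe has roughly twice the heat to move, which would explain the toasty CPU readings.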

Edited by Ashtrix
  • Thumbs Up 2

3 hours ago, Clyde said:

Does Compal make barebones for Clevo, or have I missed something? :blink:

 

6 hours ago, Z3us said:

Isn't it proof that there will be an upgrade for the 1070? https://youtu.be/6zZWikvDLPw

 

Guys, chill, that's an MSI 1070 card (QS). It was posted here long ago, and even before that the MSI 16L1 barebone was around, now a.k.a. the Eurocom Tornado F5. Yes, it does have the LGA socket and a green mobo, as opposed to the black mobo in the new MSI machines. Also, that card won't fit normal MXM 3.0b machines: a Dremel is needed, along with an eDP LCD display, a vBIOS, and finally (for now) an EC hard mod for the GPU (again, everything was posted by Prema here) to make them work with the P7/P8 3-series and the rest of the MXM 3.0b machines. Also, in case you've missed it, check the OP: the 1060 MSI QS card is a proper 3.0b MXM size. Everything above applies to any Pascal upgrade on Maxwell HW. Wondering why? Ngreedia!!

Edited by Ashtrix
  • Thumbs Up 1

3 hours ago, knight said:

Clyde, you're alive! O.o

I don't think that 1070 will fit standard MSI or Clevo machines. :/ There will be a problem with the EC unless there's some HW mod or a heavy software "key".

 

Of course I'm alive. It's just that since Clevo swapped all the engineers for accountants, I have no one to talk to. Physical values vs. virtualization. :lol:

 

13 minutes ago, Ashtrix said:

 

 

Guys, chill, that's an MSI 1070 card (QS). It was posted here long ago, and even before that the MSI 16L1 barebone was around, now a.k.a. the Eurocom Tornado F5. Yes, it does have the LGA socket and a green mobo, as opposed to the black mobo in the new MSI machines. Also, that card won't fit normal MXM 3.0b machines: a Dremel is needed, along with an eDP LCD display, a vBIOS, and finally (for now) an EC hard mod for the GPU (again, everything was posted by Prema here) to make them work with the P7/P8 3-series and the rest of the MXM 3.0b machines. Also, in case you've missed it, check the OP: the 1060 MSI QS card is a proper 3.0b MXM size. Everything above applies to any Pascal upgrade on Maxwell HW. Wondering why? Ngreedia!!

 

It was just a rhetorical question.

  • Thumbs Up 2
