Prema

PASCAL-MXM & P-SERIES REFRESH


34 minutes ago, octiceps said:

 

4-way "works", but it's a terrible experience just watching the video, and I'm not even playing the game. FPS and GPU usage drops all over the place, hitching, massive microstutter whenever the camera is turned, frame time spikes into the hundreds of ms range, etc. 99th percentile frame rate during that run was probably under 20 FPS. This is actually perfect evidence of why SLI is for number-chasers, not for a smooth gaming experience. I'd love to see Digital Foundry get their hands on a Titan XP 4-way and FCAT it so they can tear that guy to shreds.

You have to take it with a grain of salt.

You, just like me, don't know if that's the game or the video.

And for a real test he would either need to do what Linus did, or get online and actually try to kill someone with a sniper rifle, or have a gun battle, which we do not see at all.

And he should have done one at 4K so we could have a better understanding of just how well (or not) it works. Some of my videos look choppy, but the game was running perfectly fine... so you can't really just base it off of that.

As to tearing him to shreds, they already did that, but he deleted all the posts about it. And the funny thing about it is the way they got SLI to run: I did that with my first-gen Titan X's, by using all 8 tabs. So that was interesting. I don't know about that PCIe stuff in the BIOS though.

Hmm. I don't know who Digital Foundry is, or what "FCAT it" means. Nor do I really care at this point. I still thought it looked good enough. I'd be more interested in seeing his benchmark scores than some silly GTA V video with all of the display settings jacked up. I kind of doubt that anything is going to really run worth a damn with such ludicrous display settings.

 

I thought Thirty IR's post from 4 days ago was pretty awesome. I don't let trolls post comments on my YouTube channel either. I moderate everything and if I don't like what people have to say I delete their comments without approving them and then ban them from my channel. Nobody gets to see any comments except for those that I approve of.

2 minutes ago, Mr. Fox said:

Hmm. I don't know who Digital Foundry is, or what "FCAT it" means. Nor do I really care at this point. I still thought it looked good enough. I'd be more interested in seeing his benchmark scores than some silly GTA V video with all of the display settings jacked up. I kind of doubt that anything is going to really run worth a damn with such ludicrous display settings.

 

I thought Thirty IR's post from 4 days ago was pretty awesome. I don't let trolls post comments on my YouTube channel either. I moderate everything, and if I don't like what people have to say I delete their comments without approving them and then ban them from my channel.

They weren't trolling. He said he was first, and he wasn't. LOL, and they clowned him on it. That's what you don't see, I'm afraid.

 

I posted a link to the guy that was actually first, but if I find those pictures of mine, I'm going to be bucking for first. :D

11 minutes ago, johnksss said:

They weren't trolling. He said he was first, and he wasn't. LOL, and they clowned him on it. That's what you don't see, I'm afraid.

Yeah, I didn't see that part. If he deleted the comments, then I did not see them. Maybe he honestly thought he was first and someone needed to point out that it wasn't true. Sounds like the record got set straight, so that's good. I don't know if that rises to the level of deserving to be torn up though. That seems a little harsh.

 

But I would not be surprised if some of the clowning got out of control and deserved to be deleted. Some of the most hateful people, and some of the most idiotic comments, I have ever seen are posted by imbeciles on YouTube and Facebook. I don't let people poop on my living room carpet or post stupid crap on my channel, and neither should anyone else.


Of course. That's a given. That happens everywhere. People go overboard and just don't want to let things go until they get their two cents in. But they weren't talking about textures and 8K/4K and ms this and that. They were telling him: you're a liar, you weren't first, and you're not giving credit to the guy who was first and who showed you how to do it in the first place. The same type of shenanigans that's the reason vBIOS files don't get made, or BIOS files don't go to certain vendors. So that's why he posted some comment in the video frames about talking to venturi and settling it, but that was not what he was telling the world for the first minute of his video. First this and that. Go tell the world. I was first and blah blah blah. :D

 

That's like someone coming along and saying they did the first teardown video of a P870DM-G. I'm going to be right there calling them a liar point blank, when I know Mr. Fox was the first (well wait, were you first with one? lol) person to even have the machine. And guess what? So will everyone else around here and over at NBR. :D

 

And that's what happened on his channel.

 

You know it's all for one and one for all.


Oh, OK. I gotcha. If he did that on purpose (lied), then he deserved to have people calling him out for it... definitely not right to do that kind of stuff. But I missed all that, and I will be the first to admit I didn't look at the situation very closely.

 

 

52 minutes ago, Mr. Fox said:

I don't know who Digital Foundry is, or what "FCAT it" means.

Digital Foundry is a company that basically analyzes game graphics and compares PC versions to consoles. They're very thorough people, and one of the very few tech-focused groups that actually advocate for, and understand, how much high-speed, low-latency RAM can help a system. Their "is the i5-2500K still relevant" piece makes some very good points; they managed to get a solid 30% extra performance just by putting some good 2133MHz RAM in it instead of 1333MHz... which then scaled with overclocking.

 

FCAT, to keep it simple, basically reads the timing data of frames sent from the GPU to the display. As far as I know, it only works with DVI connections, so Polaris and Pascal cards haven't had FCAT testing done on them. Essentially, it's testing frametimes, which is the time between each frame. Let's say you've got a constant 120fps. You should be seeing exactly 8.33ms between each frame for it to be perfectly smooth. If your framerate is 120 but your frametimes are, say, 2ms, 12ms, 12ms, 18ms, 3ms, 3ms, etc. (I'm just being wild with this), your game will look like your FPS is about 20, not 120. High FPS with stuttery gameplay is not a pleasant experience, though to my knowledge it has little effect on benchmark scores, since in a one-second timeframe the same number of frames is still delivered whether your frametimes are consistent or all over the place (if I'm wrong, do correct me).
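The frametime arithmetic above can be sketched in a few lines. This is an illustrative calculation only (made-up timestamps, a simple nearest-rank percentile), not FCAT itself:

```python
# Sketch of the frametime math described above. The timestamps are
# fabricated for illustration; FCAT captures real ones from the display link.

def frametimes_ms(timestamps):
    """Frame-to-frame deltas in milliseconds."""
    return [(b - a) * 1000.0 for a, b in zip(timestamps, timestamps[1:])]

def percentile(values, pct):
    """Nearest-rank percentile, no external dependencies."""
    ordered = sorted(values)
    rank = max(0, min(len(ordered) - 1, round(pct / 100.0 * (len(ordered) - 1))))
    return ordered[rank]

# Perfectly paced 120 FPS: one frame every 1/120 s.
smooth = [i / 120.0 for i in range(121)]

# Same ~120 frames per second, but unevenly paced (microstutter):
# alternating short and long frames that still average out to 120 FPS.
jittery, t = [], 0.0
for i in range(120):
    t += 0.002 if i % 2 == 0 else (2.0 / 120.0 - 0.002)
    jittery.append(t)

print(percentile(frametimes_ms(smooth), 99))   # ~8.33 ms: smooth
print(percentile(frametimes_ms(jittery), 99))  # ~14.7 ms: stutter at the same avg FPS
```

Both runs deliver the same number of frames per second, which is why an average-FPS counter (or a benchmark score) can hide the stutter that the 99th-percentile frametime exposes.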

 

In other words, for the purpose of gaming, consistent frametimes are important. 4-way SLI works, as far as I know (from nVidia's explanations anyway), by layering SFR on top of AFR. Something like this:

 

Top half of screen

--------------------------------

Bottom half of screen

 

Cards #1 and #3 run two-way AFR on the top half, and cards #2 and #4 run two-way AFR on the bottom half. SFR is split-frame rendering (there's a ton of other names for it too, but I'll simply stick with that one) where each card runs one portion of the screen. So it's using the AFR of two GPUs as "GPU 1" and the AFR of the other two as "GPU 2" and running SFR. It generally needs a lot of bandwidth and driver optimization to have consistent frame timings (and often brings negative scaling due to lack of bandwidth in some titles), but benchmarks love it.
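The pairing described above can be sketched as a toy frame-assignment function. The GPU numbering and which pair takes which half are assumptions for illustration, not Nvidia's actual scheduler:

```python
# Toy model of the "SFR of AFR pairs" scheme described above:
# cards #1/#3 alternate frames on the top half of the screen,
# cards #2/#4 alternate frames on the bottom half.

def gpu_for(frame, half):
    """Return which GPU (1-4) renders `half` ('top' or 'bottom') of `frame`."""
    if half == "top":
        return 1 if frame % 2 == 0 else 3   # AFR pair for the top half
    return 2 if frame % 2 == 0 else 4       # AFR pair for the bottom half

for frame in range(4):
    print(frame, gpu_for(frame, "top"), gpu_for(frame, "bottom"))
# frame 0 -> top: GPU 1, bottom: GPU 2
# frame 1 -> top: GPU 3, bottom: GPU 4
```

The point of the sketch is that every displayed frame depends on two GPUs finishing in sync, which is exactly why bandwidth and driver timing problems show up as inconsistent frametimes.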

 

On the other hand, gaming on it with bad frame timings is an absolute pain, and many consider the higher FPS counts not worth the stutter.

 

On the opinion side, if someone got 4-way SLI working on Titan X Pascals and is enjoying the life out of himself? Well sure, kudos to him. I'm in the boat where I'd need things to be smooth enough to play, but I'm not going to rain on another's parade for being happy. I WILL however attack people who lie about things, because it gives other people the wrong idea, usually about what to expect from spending cash on certain products, but about other things too.

 

1 hour ago, johnksss said:

Edit: It says i'm using just under 10 gigs of video memory for 2560x1440P@120hz GTAV

Curious, but what software reads out this 10GB for you?

1 hour ago, johnksss said:

You, just like me, don't know if that's the game or the video.

 

The microstutter was apparent throughout the video, and the frame time spikes could be verified on the OSD (despite not being accurate since only FCAT can accurately measure FPS and MSPF in mGPU), so clearly the game was running sub-optimally.

 

Anyway forget all the statistics that you benchmarkers go gaga over. Stop looking at the OSD and just watch the gameplay. It doesn't even pass the eyeball test for smoothness.


At the end of the day, it does not matter to me. It was only to say it works. You are more than welcome to go take it up with him, though, instead of speculating over here with us. Asking the source is clearly a better option, right?


Check his other recent videos. How well it works depends on the game. Crysis 3? Great, but we knew this years ago. BF4? Not so great, but still not bad. GTA V? Awful.

 

Anyway I already gave my opinion when I downvoted that video, so there's that. ;)

 

P.S. I'm not disagreeing about whether 4-way works or not, just about its practicality and effectiveness. Nvidia is the king of artificial limitations, so it doesn't surprise me at all that end-users have found workarounds, since this is what we've always done. Did you know that on a 4-way setup, you can force a game to run in n-way mode just by changing one bit in the SLI profile?

On 9/1/2016 at 2:11 PM, Prema said:

 

So yeah, that's the MSI 1080 SLI without external power connection.  

Will add clean shots shortly...

Again a changed die position compared to its solo counterpart. Also keep in mind that (just like their 980s before them) MSI's SLI versions are lower-powered than their solo counterparts.

Now that little bit extra there is just their style of humor...what a royal waste of space on an otherwise perfectly good board.


This is really a joke. I'm still waiting for the first pictures of triangular or circular PCBs.

On 9/3/2016 at 8:00 PM, D2ultima said:

Curious, but what software reads out this 10GB for you?

GTA V. To me that is a weird reading, since the max on each card is 8192.

3 hours ago, Bugii said:


This is really a joke. I'm still waiting for the first pictures of triangular or circular PCBs.

Why would they need an additional power plug? After all, they can't cool the card properly even without one. The good news is that with this arrangement of supply components, it has a chance to be cooled without pulling everything through the board traces. :lol:


 

Un-hide this ~33299 points 3D11 1080 SLI result please:

3D11.jpg

:D
6 hours ago, johnksss said:

GTA V. To me that is a weird reading, since the max on each card is 8192.

GTA doesn't read vRAM properly; it doubles vRAM in SLI, for one thing. If GTA said you were using "10GB" in SLI, then you were using around 5GB. It used to tell me I was using 6.6GB when I obviously only have 4GB, and OSD readouts were around 3200MB.

 

Here:

GTA5_2015-09-17_03-00-15.jpg

 

Old screenshot, but you can see its readout is bugged.
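The rule of thumb above fits in a tiny helper. The divide-by-GPU-count behavior is an observation from these posts, not documented Rockstar behavior:

```python
# GTA V's in-game VRAM meter appears to multiply usage by the number of
# SLI GPUs (per the posts above), so dividing gives a per-GPU estimate.

def estimated_per_gpu_vram_mb(reported_mb, gpu_count):
    """Undo the apparent SLI multiplication in the in-game readout."""
    return reported_mb / gpu_count

print(estimated_per_gpu_vram_mb(10240, 2))  # "10GB" reported in 2-way SLI -> 5120 MB
print(estimated_per_gpu_vram_mb(6600, 2))   # 6.6GB reported on a 4GB card -> 3300 MB
```

The second case matches the ~3200MB OSD readout mentioned above, which is what makes the doubling theory plausible.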

1 hour ago, D2ultima said:

GTA doesn't read vRAM properly; it doubles vRAM in SLI, for one thing. If GTA said you were using "10GB" in SLI, then you were using around 5GB. It used to tell me I was using 6.6GB when I obviously only have 4GB, and OSD readouts were around 3200MB.

 

Here:

GTA5_2015-09-17_03-00-15.jpg

 

Old screenshot, but you can see its readout is bugged.

I have seen other games report similar erroneous information about VRAM before. I generally just ignore it. If the game is performing well, I don't even bother checking anything except CPU and GPU temps.

10 hours ago, johnksss said:

Desktop cards are now caught.

Spoiler

 

b05PLTn.png pUaIhzw.png

 

 

Awesome stuff. But, it only looks like desktops are caught when using the right tool for the job. A BGA turdbook isn't going to get the job done, or so it seems based on what has been uploaded.

 

Here is a comparison of the best BGA CPU 1080 SLI scores (MSI) against the best 1080 SLI scores on a proper machine with sockets and slots (Clevo). There are a lot of Clevo scores above the top MSI scores. The highest MSI scores look to be submitted by arthas2005 and he is not in the top 10 on any of them. I think I used the phrase "blood bath" once before.

 

3DMark 11: http://www.3dmark.com/compare/3dm11/11554627/3dm11/11555847

 

Fire Strike: http://www.3dmark.com/compare/fs/9954310/fs/10053831#

 

Time Spy: http://www.3dmark.com/compare/spy/356775/spy/393681

 

No link is provided for Vantage. It appears nobody with a 1080 SLI BGA turdbook has submitted a Vantage score.

 

I have nothing for ASUS because I didn't drill down far enough on the page to see if ASUS even had any scores uploaded.

 

MmKrnsp.jpg

5 hours ago, Mr. Fox said:

Awesome stuff. But, it only looks like desktops are caught when using the right tool for the job. A BGA turdbook isn't going to get the job done, or so it seems based on what has been uploaded.

 

Here is a comparison of the best BGA CPU 1080 SLI scores (MSI) against the best 1080 SLI scores on a proper machine with sockets and slots (Clevo). There are a lot of Clevo scores above the top MSI scores. The highest MSI scores look to be submitted by arthas2005 and he is not in the top 10 on any of them. I think I used the phrase "blood bath" once before.

 

3DMark 11: http://www.3dmark.com/compare/3dm11/11554627/3dm11/11555847

 

Fire Strike: http://www.3dmark.com/compare/fs/9954310/fs/10053831#

 

Time Spy: http://www.3dmark.com/compare/spy/356775/spy/393681

 

No link is provided for Vantage. It appears nobody with a 1080 SLI BGA turdbook has submitted a Vantage score.

 

I have nothing for ASUS because I didn't drill down far enough on the page to see if ASUS even had any scores uploaded.

 

MmKrnsp.jpg

 

These comparisons send a clear message and kill the arguments about BGA. Good job @johnksss; waiting for @Mr. Fox to give us the end of this beast :)

6 hours ago, Mr. Fox said:

Copying and pasting from NBR:

I thought something was very odd about this... and I was right. In Firestrike ALONE he runs at "4GHz" (the rest have a slower reported clockspeed, unlike John's benches). But his physics score is too low. Here's @Spellbound's physics score from a random bench she did with stock CPU clocks: http://www.3dmark.com/fs/9595299
 
Note how her physics score was 12500? The 6920HQ at 4GHz (I know it can clock up to and hold 4GHz on all 4 cores without issue as long as the laptop isn't limiting it somehow) only had 11059 for Physics. That's too large a discrepancy; it was throttling pretty hard. My 3.8GHz 4800MQ can get about 10400 with crap RAM; Skylake at 3.8GHz or a bit less might hit 11,000. But that wasn't 4GHz.
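The throttling argument above can be sanity-checked with simple linear scaling: for the same CPU and RAM, the Physics score tracks sustained clock roughly linearly, so a known score/clock pair implies the clock behind another score. Taking the posts' numbers at face value (linear scaling is an approximation, and whether the reference run is a fair comparison is disputed below):

```python
# Back out the effective sustained clock a Physics score implies, assuming
# score scales roughly linearly with clock on comparable hardware.
# All numbers are from the forum posts, not measured here.

def implied_clock_ghz(score, ref_score, ref_clock_ghz):
    """Effective sustained clock implied by `score`, via linear scaling."""
    return ref_clock_ghz * score / ref_score

# Reference: ~12500 Physics at a sustained 4.0 GHz.
# The disputed run scored 11059 while claiming 4.0 GHz:
print(round(implied_clock_ghz(11059, 12500, 4.0), 2))  # ~3.54 GHz, i.e. throttling
```

A ~0.45 GHz gap between the claimed and implied clock is the "too large a discrepancy" being argued, assuming the two runs are otherwise comparable.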
17 minutes ago, D2ultima said:

Copying and pasting from NBR:

I thought something was very odd about this... and I was right. In Firestrike ALONE he runs at "4GHz" (the rest have a slower reported clockspeed, unlike John's benches). But his physics score is too low. Here's @Spellbound's physics score from a random bench she did with stock CPU clocks: http://www.3dmark.com/fs/9595299
 
Note how her physics score was 12500? The 6920HQ at 4GHz (I know it can clock up to and hold 4GHz on all 4 cores without issue as long as the laptop isn't limiting it somehow) only had 11059 for Physics. That's too large a discrepancy; it was throttling pretty hard. My 3.8GHz 4800MQ can get about 10400 with crap RAM; Skylake at 3.8GHz or a bit less might hit 11,000. But that wasn't 4GHz.

This is exactly why things go the way they do.

Why don't you try matching it with a SINGLE GPU setup, like it was meant to be compared to, instead of thinking you found the holy grail of mistakes. :frantics:

http://www.3dmark.com/compare/fs/9988160/fs/9595299

 

Edit:

And I have explained this thing with physics quite a few times already, but no one seems to get it so I shut up about it. Those that figure it out see the gains while others do not.

1 minute ago, johnksss said:

This is exactly why things go the way they do.

Why don't you try matching it with a SINGLE GPU setup, like it was meant to be compared to, instead of thinking you found the holy grail of mistakes. :frantics:

http://www.3dmark.com/compare/fs/9988160/fs/9595299

What. Are. You. Talking. About?

 

I am claiming that the person Mr. Fox compared your score to, who benched Firestrike with his 6920HQ at 4GHz, only managed 11k physics, whereas the benchmark I posted managed 12.5k physics, which means the 6920HQ machine was throttling.

 

Why are you showing me one of your benches against the one I linked? There is no reason to do that.


 http://www.3dmark.com/fs/9595299

That is a 6700K; where do you get a 6920HQ from that?

 

I think you might want to go back and check first before jumping the gun my friend....

 

These are closer to what you should be looking at.

http://www.3dmark.com/compare/fs/10011170/fs/10001580/fs/10059807

And like I have pointed out many times before, a single-GPU physics score will always be higher, by a very, very long shot, unless the person benching does not know what they are doing.

(Apparently not the case with 6920HQ and 980N. See below)

 

http://www.3dmark.com/compare/fs/9269885/fs/7575204/fs/8619427/fs/7602138/fs/9312667/fs/10053831

 

But this one is the one that would prove your point..

And now I understand why hmscott made the comment about higher physics with dual cards. That would seem to hold true for the 6920HQ.

http://www.3dmark.com/compare/fs/10053831/fs/7808124/fs/8437252/fs/8124431


I did a little benching with moderate overclocks this afternoon, connected via TeamViewer to an unattended Sky X9E2 at Eurocom's campus. I was not able to control the fans and the CPU temps were totally out of control, so this was with whatever thermal paste they have, sitting on a flat surface, with some pretty severe CPU thermal throttling. I could not push the CPU any further without direct access to the machine, but it's still not too shabby all things considered. With a delid and Liquid Ultra the CPU should be totally fine, just as it was with the P870DM-G. The GPUs were warm but not overheating. I also had to play it conservative, since they are about 1,500 miles away and closed for the day (after hours), so I had to avoid doing anything that could cause the machine to freeze or lock up. All things considered, I am really impressed. I used the Clevo CPU and GPU overclocking tools in Clevo Control Center.

 

http://www.3dmark.com/3dm11/11556701

fn73xo5.jpg

 

http://www.3dmark.com/fs/10072213

G06O4s3.jpg

 

http://www.3dmark.com/sd/4289532

FxpmICm.jpg
