Guest Posted September 3, 2016

6 hours ago, Prema said:

John setting the record straight, I like that! @MSI, Asus and Acer (huh? NICE!) 1080 SLI, that's Clevo with stock firmware for you! @Meaker do your worst! @Mr. Fox we need you in the DM3 family!!! Let the games begin while we prepare the mods...

Trust me when I say I want to be. I already miss my DM-G. So, I am going to put my desktop plans on the back burner while we see if there is a way to make it happen.

2 hours ago, Brian said:

Something that future owners of these SLI notebooks should keep in mind is that SLI is a complete failure so far in DX12, with virtually no titles supporting it or CrossFire, since that duty now falls to developers. You will still do fine in most DX11 titles, but with DX12 taking over, don't get your hopes up about mGPU and SLI support. If you're a benchmark chaser, well, carry on. As a regular gamer, I ended up with a single desktop Titan X Pascal for that very reason. It's the first time I've skipped SLI in years, as its future for now seems questionable outside synthetic benchmark software. Something worth reading: http://www.babeltechreviews.com/sad-state-crossfire-sli-today/3/

Yeah, it really is a shame that the lowest common denominators damn the world with their low standards, but since I am a benchmark chaser I will carry on as suggested. The only real downside is the added cost of a second GPU, SLI bridge and better PSU. When and where it works, you get to kick ass. When and where it doesn't, you can dedicate the second card to PhysX and still enjoy an equal or better experience than those with one GPU. So, if a person can afford the extra hardware, it's still a win. I'd love to have Titan X Pascal in SLI, but wow... lots of money.

0 minutes ago, johnksss said:

And this would make for a good argument on why they wanted to put desktop-type GPUs in notebooks instead of mobile. They could cut out SLI/CrossFire and just set the power to match that of the desktop counterpart. Although I don't see them matching the pricing any time soon.

Indeed... it's very disproportionate how they gouge notebook owners on pricing. One Titan X is basically the same price as a Clevo 1080 MXM GPU... $1,200 USD. One 1080 notebook GPU is almost double the cost of a desktop 1080. The performance is getting closer, but the unfavorable price structure is headed in the exact opposite direction. We used to bitch about 780M and 980M pricing, but they were a bargain compared to now. We went from 50% more expensive to double.
Guest Posted September 3, 2016

11 minutes ago, Mr. Fox said:

Trust me when I say I want to be. I already miss my DM-G. So, I am going to put my desktop plans on the back burner while we see if there is a way to make it happen.

I forced myself to stay in my office for a few days this week to use my P750ZM, working in the same spot all day, just as I would have to with a desktop. I'd be lying if I said I liked it. Being tethered to a desk sucks. I'm back in the living room with Mrs. Fox and the grandkids with the P750ZM in my lap. Definitely a better way to live than hiding out in my "dungeon" LOL.
johnksss Posted September 3, 2016

In my situation, that would not be a good thing. I don't want anyone around my LN2/dry ice/phase change or water chiller. That could turn out disastrous. And there is way too much expensive stuff sitting around that could end up broken, unfortunately. And the grandkids have their own gaming rig, which I really need to ship off to them next week. lol And when benching with LN2, you do not have time to watch anyone but yourself and what you are doing. lol

Edited September 3, 2016 by johnksss
octiceps Posted September 3, 2016

2 hours ago, Mr. Fox said:

When and where it doesn't you can dedicate it for PhysX

Sorry to burst your bubble Mr. Fox, but PhysX is dead as a doornail since current-gen consoles and their successors are all AMD silicon. Your only hope for PhysX is GameWorks titles, and we all know what the general consensus on those is, despite the fact that AMD's equivalent, GPUOpen, is as bad if not worse (case in point: Deus Ex: Mankind Divided).

Edit: And you certainly don't need a very powerful GPU as a dedicated PhysX PPU. Even a lowly 750 Ti is all you need to pair with a 1080.

Edited September 3, 2016 by octiceps
johnksss Posted September 3, 2016

Well damn, then there goes that idea... And I'm still not selling my second card.

Edited September 3, 2016 by johnksss
Guest Posted September 3, 2016

22 minutes ago, octiceps said:

Sorry to burst your bubble Mr. Fox, but PhysX is dead as a doornail since current-gen consoles and their successors are all AMD silicon. Your only hope for PhysX is GameWorks titles, and we all know what the general consensus on those is, despite the fact that AMD's equivalent, GPUOpen, is as bad if not worse (case in point: Deus Ex: Mankind Divided). Edit: And you certainly don't need a very powerful GPU as a dedicated PhysX PPU. Even a lowly 750 Ti is all you need to pair with a 1080.

No worries. The bubble is still intact and will be for the foreseeable future. Using the second GPU for PhysX would be a creative use for it when effed-up games don't support SLI, not a primary use for it. PhysX has never been ubiquitous because of the NVIDIOTS and their greedy, self-centered approach to literally everything they do. The destiny of PhysX was doomed by the greed of its creator, which is truly a shame. But everything else still applies. Bench with SLI and game where supported, and where it's not supported you game like a single-GPU peasant. Still a win, just not as much as it used to be. I cannot be content to settle for a single GPU since I'm not really a gamer-boy. Gaming is not the most important thing for me... overclocked benching is. Gaming comes in second or third place behind benching, pretty much tied with watching movies as a form of entertainment when I am not benching. The mediocrity of AMD and consoles has more or less ruined gaming for everyone by turning it into a hobby dominated by peasants. It used to be something I was passionate about, and now it's something I only do for cheap thrills as an alternative to boredom. There are a few FPS titles I am very passionate about and enjoy immensely, but I can count them on my fingers with a digit or two left over.
D2ultima Posted September 3, 2016

5 hours ago, Brian said:

Something that future owners of these SLI notebooks should keep in mind is that SLI is a complete failure so far in DX12, with virtually no titles supporting it or CrossFire, since that duty now falls to developers. As a regular gamer, I ended up with a single desktop Titan X Pascal for that very reason. It's the first time I've skipped SLI in years, as its future for now seems questionable outside synthetic benchmark software. Something worth reading: http://www.babeltechreviews.com/sad-state-crossfire-sli-today/3/

As far as I know, it can be done driver-side, just like all the other "optimizations" that DX12 is bringing... every dev is in a hurry to toss DX12/Vulkan onto a game, but they don't bake a single iota of optimization into it and leave the driver to do it all. In this sense, it's basically DirectX 11 with less CPU usage, and that's nothing special whatsoever (properly coded games wouldn't be eating i7s for breakfast like 92% of AAA titles since 2015 do). In fact, that's pretty much the reason nVidia has been doing worse in DX12 and Vulkan than in DX11... their DX11 drivers have as much CPU overhead as possible removed and are very mature. So since DX12/Vulkan relies on the (immature) drivers for optimizations, and there is little to no further CPU overhead to remove in the driver (though in-game CPU usage is lowered), they perform worse. Basically, we've been waiting on drivers to give us DX11-like performance or better with DX12/Vulkan.

On the other hand, it's the reason why AMD performs much better. Since DX12/Vulkan kills the ridiculous CPU overhead in their DX11 driver, their cards perform close to what the hardware actually allows. So take DX12/Vulkan performance on AMD cards, add about 5-10% from driver optimizations (just like nVidia had to do), and THAT is what AMD cards should be doing in DX11 today. Hence my intense annoyance at them for focusing on everything that doesn't matter for competing right now.

That's what I would do... except I would just get two Titan X Pascals =D. I'm irrational when it comes to this... a single GPU will just never be enough for ol' D2. But yes, it's extremely questionable. The problem is that while every dev under the sun is happy coding AFR-unfriendly tech into games, none of them give a flying meowmix about actually optimizing the damn games. "Unreal Engine 4 omg" is so popular in gaming, but then there are games in which I can barely get 60fps on low while forcing SLI (Dead by Daylight is a good example), whereas on a single GPU I get 100+fps maxed in Unreal Tournament 4. If every game ran like UT4, I'd be perfectly fine. But not every game does, and that's a rather large issue. And it keeps happening with almost every title I see, especially ones on Unity 5/Unreal 4. Did you know it takes approximately 85-95% of one of my 780Ms to render Street Fighter 5 at 1080p with the settings turned up? You know, the game that looks like this? Compare that to UT4 here. You understand what I mean.

2 hours ago, Mr. Fox said:

So, I am going to put my desktop plans on the back burner while we see if there is a way to make it happen. We used to bitch about 780M and 980M pricing, but they were a bargain compared to now. Went from 50% more expensive to double.

Ah, you and I are so alike in our irrationality. Neither of us wants to give up our notebooks. The 780M was $860 when the 770 4GB was $450, though, and the 980M was $720 when the 970 was $330.
We've been around or over double price for quite some time. The 1080 is replacing the notebook 980, though, which was $1,200 over the $550 desktop model, so... if anything... the price markup has actually gone down? =D? =D =D? (Yeah, I know it's still garbage.)

2 minutes ago, Mr. Fox said:

Bench with SLI and game where supported, and where it's not supported you game like a single GPU peasant.

I AM DISAPPOINTED IN YOU MISTER FOX. D=. Let me fix your statement: "Bench with SLI and game where supported, and where it's not supported you force SLI on that bastard with nVidia Profile Inspector like a true enthusiast until it works, laugh at people who don't know how to do this, and still game in SLI." There, I fixed your statement for you.
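[Editor's note: for readers wondering what Profile Inspector actually touches, it edits the same driver settings store (DRS) that NVIDIA's public NVAPI SDK exposes. Below is a minimal sketch, assuming the public nvapi.h and NvApiDriverSettings.h headers; the profile name is a stand-in, and in practice the rendering-mode override alone is often not enough. The real trick NPI enables, copying a working title's SLI compatibility bits, lives in other DRS settings.]

```cpp
// Sketch: force an SLI rendering mode on one game's driver profile via
// NVAPI DRS. Roughly what Profile Inspector does through its GUI.
#include <cstdio>
#include "nvapi.h"                // public NVAPI SDK
#include "NvApiDriverSettings.h"  // setting IDs and enum values

int main()
{
    if (NvAPI_Initialize() != NVAPI_OK) return 1;

    NvDRSSessionHandle session = 0;
    NvAPI_DRS_CreateSession(&session);
    NvAPI_DRS_LoadSettings(session);  // read the current profile store

    // Hypothetical target: substitute the game's actual DRS profile name.
    NvAPI_UnicodeString name = {};
    const wchar_t* game = L"Dead by Daylight";
    for (int i = 0; game[i] && i < NVAPI_UNICODE_STRING_MAX - 1; ++i)
        name[i] = static_cast<NvU16>(game[i]);

    NvDRSProfileHandle profile = 0;
    if (NvAPI_DRS_FindProfileByName(session, name, &profile) == NVAPI_OK) {
        NVDRS_SETTING setting   = {};
        setting.version         = NVDRS_SETTING_VER;
        setting.settingId       = SLI_RENDERING_MODE_ID;
        setting.settingType     = NVDRS_DWORD_TYPE;
        setting.u32CurrentValue = SLI_RENDERING_MODE_FORCE_AFR2;
        NvAPI_DRS_SetSetting(session, profile, &setting);
        NvAPI_DRS_SaveSettings(session);  // persist so the driver sees it
        printf("Forced AFR2 on the profile.\n");
    }

    NvAPI_DRS_DestroySession(session);
    NvAPI_Unload();
    return 0;
}
```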
Guest Posted September 3, 2016

22 minutes ago, D2ultima said:

Let me fix your statement: "Bench with SLI and game where supported, and where it's not supported you force SLI on that bastard with nVidia Profile Inspector like a true enthusiast until it works, laugh at people who don't know how to do this, and still game in SLI." There, I fixed your statement for you.

Thanks. That's exactly what I meant to say. LOL.
D2ultima Posted September 3, 2016

38 minutes ago, Mr. Fox said:

Thanks. That's exactly what I meant to say. LOL.

Yes. GOOOD. Mwahahahahaha.

Also, @Brian, thank you for that link to the SLI chart testing. I basically knew all of that already, but it is an EXTREMELY nice collection of such testing (since I do not have the hardware myself) that I can show to other people, and it's already been added to my SLI guide.

Edit: Brian, do you think you could relax my forum signature limit to 3 URLs? I cannot change my signature at all without removing one of the existing links, and I wanted to update the specs portion a bit. Relaxing the whole forum's requirements a bit works too.

Edited September 3, 2016 by D2ultima
johnksss Posted September 3, 2016

Well, scaling has been made to work with four Titan XPs...
octiceps Posted September 3, 2016

38 minutes ago, johnksss said:

Well, scaling has been made to work with four Titan XPs...

In benchmarks, sure. But a lot of recent games like Doom and The Witcher 3 are quite PCIe-bandwidth-heavy (even assuming an HB bridge) at native 4K with temporal AA, so you need PCIe 3.0 x16 per card for positive, optimal scaling. That means only an HEDT platform with a 40-lane CPU would suffice, which rules out anything above 2-way SLI.

Edited September 3, 2016 by octiceps
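[Editor's note: octiceps's bandwidth point is easy to sanity-check with napkin math. The sketch below is a lower bound under stated assumptions; it counts only one finished color target plus one TAA history buffer per frame, and the "usable" PCIe figures are rough estimates, not spec maxima.]

```cpp
#include <cstdio>

int main()
{
    const double px    = 3840.0 * 2160.0;  // native 4K pixel count
    const double rgba8 = px * 4.0;         // one 32-bit color target, bytes

    // Under AFR the slave GPU ships every finished frame to the display
    // GPU, and a temporal-AA pass needs the previous frame's history
    // buffer, which AFR leaves on the *other* card. Call it two full
    // targets per rendered frame (an assumption; real engines also move
    // motion vectors, depth, and streamed textures).
    const double perFrame = 2.0 * rgba8;

    // Rough usable PCIe 3.0 throughput, not theoretical peak (assumption).
    const double x16 = 13e9, x8 = 6.5e9;   // bytes/s

    printf("per-frame inter-GPU copy: %.1f MiB\n", perFrame / (1024 * 1024));
    printf("copy time on x16: %.1f ms, on x8: %.1f ms (a 60fps slot is 16.7 ms)\n",
           1e3 * perFrame / x16, 1e3 * perFrame / x8);
    return 0;
}
```

Roughly 10 of the 16.7 milliseconds eaten by copies alone on x8 lanes, copies that only partially overlap rendering, is why the 40-lane HEDT chips matter here.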
Brian (Founder) Posted September 3, 2016

Unfortunately, newer engines like UE4 and Unity lack SLI support, and given how lazy most developers are, SLI support is shrinking in newer DX11 titles as well. In many of those new games, forcing SLI does nothing for scaling. Keep in mind GPU usage and scaling are two very different things, and the evidence so far shows little to no benefit from SLI these days in newer titles, especially if you go beyond two GPUs. And as I said, DX12 all but ensures SLI and CrossFire are dead. As octiceps pointed out, this is compounded by the fact that AMD silicon dominates consoles; only a tiny handful of games today are PC-first, and those that are normally don't need more than a single GPU for max settings. Even nvidia is scaling back SLI support, and that should be a wake-up call that it is dying out. It's okay (for now) if you chase numbers, but most people buy these systems for gaming, and with that in mind it's better to skip SLI and get the single most powerful GPU. You get much lower frametimes, "100% scaling," and no microstutter. If I were in the market for a new Pascal notebook and an avid gamer, I'd save the money, pass on the dual GPU, and instead put it into a bigger and better SSD.
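[Editor's note: Brian's distinction between GPU usage and scaling trips a lot of people up, so here is a toy illustration. All numbers are invented for the example.]

```cpp
#include <cstdio>

// Utilization and scaling measure different things: a card can report
// near-100% "usage" while busy-waiting on its partner, so only the FPS
// ratio tells you whether the second GPU bought anything.
int main()
{
    const double fpsSingle = 62.0;  // one GPU (invented)
    const double fpsSli    = 71.0;  // two GPUs, forced profile (invented)
    const double usage0 = 0.97, usage1 = 0.95;  // reported utilization

    printf("utilization: %.0f%% / %.0f%%  (looks like SLI is 'working')\n",
           usage0 * 100, usage1 * 100);
    printf("scaling    : %.2fx, i.e. only %.0f%% of a second GPU's worth\n",
           fpsSli / fpsSingle, 100.0 * (fpsSli / fpsSingle - 1.0));
    return 0;
}
```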
D2ultima Posted September 3, 2016

3 minutes ago, Brian said:

Unfortunately, newer engines like UE4 and Unity lack SLI support, and given how lazy most developers are, SLI support is shrinking in newer DX11 titles as well. In many of those new games, forcing SLI does nothing for scaling. Keep in mind GPU usage and scaling are two very different things, and the evidence so far shows little to no benefit from SLI these days in newer titles, especially if you go beyond two GPUs. And as I said, DX12 all but ensures SLI and CrossFire are dead. As octiceps pointed out, this is compounded by the fact that AMD silicon dominates consoles; only a tiny handful of games today are PC-first, and those that are normally don't need more than a single GPU for max settings. Even nvidia is scaling back SLI support, and that should be a wake-up call that it is dying out. It's okay (for now) if you chase numbers, but most people buy these systems for gaming, and with that in mind it's better to skip SLI and get the single most powerful GPU. You get much lower frametimes, "100% scaling," and no microstutter.

Yes, I know that scaling and utilization are different. But I DO notice better frames with SLI on versus off; otherwise I disable it. Dark Souls 3 is one such game where SLI on means negative scaling. But yes, as I said: get the single strongest GPU before even considering SLI. And there is NO AMD card on the market I would CrossFire. None. CrossFire IS dead.

Forcing SLI through nVidia Profile Inspector helps, though. I've been able to get SLI working with positive scaling on multiple games, even UE4 ones, that don't support SLI out of the box: Killing Floor 2, three months before it got an official profile. Ark: Survival Evolved, which will never get a profile. Unreal Tournament 4 can have SLI forced, though TAA shouldn't be used. How to Survive 2 worked well with forced SLI and let me max it out and keep my 120fps constant where a single GPU couldn't. Toxikk was also easily forced. Overwatch I forced before its profile appeared in the driver, and it worked swimmingly. CrossFire can't do any of that. BUT let me be clear: SLI is not for somebody who isn't fully willing to put in a lot of elbow grease to get it working. It just is not. 1080s are the best GPUs for the mobile platform, so it makes sense to SLI them. If they shoved a single Titan X Pascal in a P870DM3 and got it working, I'd grab that over 1080 SLI in a heartbeat. But as of now, it isn't here. On desktops, as I said: Titan X Pascal before SLI-ing.

If games are running DX12 like they're running DX11 (which they are), then nVidia can add DX12 profile bits. NVPI already has DX12 bits in a section, though since I refuse to touch Windows 10 on this PC it's a pointless option for me. But it means the driver supports it, and it can be enabled in games. A simple "force AFR1" or "force AFR2" won't cut it, though. However, as things currently stand, there really is no benefit to multi-GPU in DX12/Vulkan titles. We'll see how that pans out in the next few months, but I think DX12 in its current state is worse for gaming than it is good. Vulkan is middle of the road: no multi-GPU and relying on driver-side optimization of games is as stupid as DX12, but it does not require Windows 10 and can run on Linux and Mac, and that makes all the difference in the world.

And yes, I know they're scaling it back.
I think, though, it's more that they can't attack the bandwidth problems without losing money (an XDMA-style card design, or consumer NVLink in a non-proprietary format, possibly over PCIe). Since they can't charge for the proper solution, they axe 3-way and 4-way, use a doubled-up LED bridge, call it "high bandwidth," and call it a day. It's a band-aid on the problem, but one that costs them basically nothing.

I'll be watching multi-GPU closely, but I already tell everyone to ignore it. I don't even mention forcing it, or anything like that. If somebody already knows about it, they'll get multi-GPU anyway. Anybody asking the question just won't enjoy it. The 1070 SLI laptop models coming out from MSI and such make me laugh. Those are pointless.
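[Editor's note: to make concrete what "that duty now falls to developers" means, under DX12's explicit multi-adapter model the runtime hands the engine every GPU and then steps aside. A minimal enumeration sketch using standard DXGI/D3D12 calls follows; everything past enumeration (cross-adapter resource copies, fences, AFR scheduling) is the part developers keep skipping.]

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cwchar>
#include <vector>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    // Under DX12 the runtime no longer hides the second GPU behind a
    // driver profile: the engine must enumerate adapters and drive each
    // one itself.
    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip WARP

        ComPtr<ID3D12Device> dev;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&dev)))) {
            // With SLI enabled, one device can also expose several linked
            // nodes; GetNodeCount() reports them.
            wprintf(L"adapter %u: %s (nodes: %u)\n",
                    i, desc.Description, dev->GetNodeCount());
            devices.push_back(dev);
        }
    }
    // From here the app itself must split work across devices, copy
    // resources between adapters, and synchronize with fences. That is
    // the code nobody writes, which is why "DX12 mGPU" so rarely ships.
    return 0;
}
```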
johnksss Posted September 4, 2016

I was only pointing out what some have done. And the game in question was GTA V. As to how well it was working, I have no idea. I'm only mentioning that it was done.
D2ultima Posted September 4, 2016

3 minutes ago, johnksss said:

I was only pointing out what some have done. And the game in question was GTA V. As to how well it was working, I have no idea. I'm only mentioning that it was done.

Interesting... though I expect four would have been a stutter fest. I would say three might perform better. It's too bad; I'd have liked to see that. Maybe a 4.8GHz 5960X with three Titan XPs finally getting GTA V maxed out and hitting 120fps constantly.
johnksss Posted September 4, 2016

23 minutes ago, D2ultima said:

Interesting... though I expect four would have been a stutter fest. I would say three might perform better. It's too bad; I'd have liked to see that. Maybe a 4.8GHz 5960X with three Titan XPs finally getting GTA V maxed out and hitting 120fps constantly.

Although he did some post cleanup and retracted part of his story... LOL. This is at 8K, from what he says. The original first working 4-way SLI: http://forums.guru3d.com/showthread.php?t=409468

Edited September 4, 2016 by johnksss
octiceps Posted September 4, 2016

1 minute ago, johnksss said:

Although he did some post cleanup and retracted part of his story... LOL. This is at 8K, from what he says.

There is smooth. And then there is that. BTW, that's not 8K. That's 4K with 4x DSR.
johnksss Posted September 4, 2016

3 minutes ago, octiceps said:

There is smooth. And then there is that. BTW, that's not 8K. That's 4K with 4x DSR.

I know that, my friend. It's mentioned in the video as well.

Edited September 4, 2016 by johnksss
D2ultima Posted September 4, 2016

9 minutes ago, johnksss said:

Although he did some post cleanup and retracted part of his story... LOL. This is at 8K, from what he says.

That is not 8K. He's using way too little video RAM for 8K (DSR incurs the same vRAM hit as actually running the resolution, if I remember correctly). 4GB of vRAM is BARELY enough for GTA V at 1080p "maxed" out. I asked Tgipier from NBR (don't believe he's on T|I) to test going from 1080p to 3440x1440 with the game on absolute maximum, and he crosses 5GB of vRAM just doing that. 4K or 8K would be much higher. He has at LEAST turned off MSAA, possibly more. I mean, good for him that 4 Titan XP cards work, certainly. But something's pretty off about his metrics in that video. He should have shown us his options menu if he's making those claims. I hate to be a skeptic about these things, but there are far, far too many people who say things like "I max Witcher 3 with a 970 and get like 144fps" when they've turned off all the GameWorks options, AA and the like.

Edited September 4, 2016 by D2ultima
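[Editor's note: D2ultima's DSR point (render internally at the high resolution, pay that resolution's memory bill) is straightforward to ballpark. The buffer layout below is invented for illustration and is not GTA V's actual one; the takeaway is the linear scaling of render-target memory with pixel count.]

```cpp
#include <cstdio>

// Render-target footprint scales linearly with pixel count, which is why
// 4x DSR costs what the real resolution costs. Hypothetical deferred-
// renderer layout: 36 bytes per pixel across all targets.
static double targetsMiB(double w, double h)
{
    const double px = w * h;
    const double gbuffer = 4 * px * 4;  // four RGBA8 G-buffer targets
    const double hdr     = px * 8;      // RGBA16F lighting target
    const double depth   = px * 4;      // 32-bit depth/stencil
    const double taa     = px * 8;      // temporal-AA history buffer
    return (gbuffer + hdr + depth + taa) / (1024.0 * 1024.0);
}

int main()
{
    printf("1080p: %6.0f MiB\n", targetsMiB(1920, 1080));  // ~  71 MiB
    printf("4K   : %6.0f MiB\n", targetsMiB(3840, 2160));  // ~ 285 MiB
    printf("8K   : %6.0f MiB\n", targetsMiB(7680, 4320));  // ~1139 MiB
    // Textures and geometry don't scale with resolution, so total usage
    // grows slower than 4x per step; but MSAA multiplies the color and
    // depth targets again, which is where 4K-plus-MSAA budgets explode.
    return 0;
}
```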
octiceps Posted September 4, 2016

1 minute ago, D2ultima said:

That is not 8K. He's using way too little video RAM for 8K (DSR incurs the same vRAM hit as actually running the resolution, if I remember correctly). 4GB of vRAM is BARELY enough for GTA V at 1080p "maxed" out. I asked Tgipier from NBR (don't believe he's on T|I) to test going from 1080p to 3440x1440 with the game on absolute maximum, and he crosses 5GB of vRAM just doing that. 4K or 8K would be much higher. He has at LEAST turned off MSAA, possibly more. I mean, good for him that 4 Titan XP cards work, certainly. But something's pretty off about his metrics in that video.

Err, the VRAM usage reporting is bugged. It says >4 TB in that video. It's been a bug in the Nvidia driver for the last several releases.
D2ultima Posted September 4, 2016

Just now, octiceps said:

Err, the VRAM usage reporting is bugged. It says >4 TB in that video. It's been a bug in the Nvidia driver for the last several releases.

That's TB? I saw it do something similar in John's Fallout 4 video, but for some reason I thought it was "4GB" as in "4,xxx,xxx KB". Ah well. I'm still a skeptic.
octiceps Posted September 4, 2016

Whatever, it doesn't matter. I logged into my Google account just so I could downvote that video. Couldn't stand the way he prattled on despite the on-screen evidence showing the game running like shit.
johnksss Posted September 4, 2016

Let's get something straight about the post. 1: We all know it's in reality 4K, or so he says (considering that OSD is pretty damn big for 4K, or even fake 8K). I thought that was a given. And as for his dialog about it? I already told you about his post cleanups and retractions, and blah blah, who cares. I said "this is at 8K, from what he says"; I never said it was 8K myself. We are jumping the gun on what was said... All I said was that he got 4-way working. The topic should have been 4-way working, not five reasons about everything other than the 4-way working. When I ran Battlefield 4 at 200% resolution scale it was using 8 gigs of video memory, so to me something is amiss or missing, but I did not take the time to tear apart his video for the answer.

Edit: In my Fallout 4 video I thought that was 4 gigs of RAM, but it was the full readout with a missing decimal point. Thanks for the clarification on that.

Edit: It says I'm using just under 10 gigs of video memory for GTA V at 2560x1440 @ 120Hz.

Edited September 4, 2016 by johnksss
Guest Posted September 4, 2016

Hmmm. Well, I didn't see anything wrong with it. I didn't watch the entire video, but the part I watched looked smooth enough. More important than that, I just thought it was nice to see someone enjoying ridiculous excess with 4-way Titan X SLI. That in and of itself is pretty damned awesome as far as I am concerned.
octiceps Posted September 4, 2016

9 minutes ago, johnksss said:

All I said was that he got 4-way working. The topic should have been 4-way working, not five reasons about everything other than the 4-way working.

4-way "works," but it's a terrible experience just watching the video, and I'm not even playing the game. FPS and GPU usage drop all over the place: hitching, massive microstutter whenever the camera turns, frame-time spikes into the hundreds of milliseconds, etc. The 99th-percentile frame rate during that run was probably under 20 FPS. This is actually perfect evidence of why SLI is for number-chasers, not for a smooth gaming experience. I'd love to see Digital Foundry get their hands on a 4-way Titan XP setup and FCAT it so they can tear that guy to shreds.
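[Editor's note: for anyone who wants to put a number on what octiceps is eyeballing, tools like FCAT and FRAPS produce per-frame times, and the 99th-percentile figure falls out of a few lines. A minimal sketch follows; the log file name and one-value-per-line format are assumptions.]

```cpp
#include <algorithm>
#include <cstdio>
#include <fstream>
#include <numeric>
#include <vector>

int main()
{
    // Hypothetical log: one frame time in milliseconds per line.
    std::ifstream log("frametimes_ms.txt");
    std::vector<double> ft;
    for (double ms; log >> ms; ) ft.push_back(ms);
    if (ft.empty()) return 1;

    // The 99th-percentile *frame time* is the slow tail (1% of frames are
    // worse); its reciprocal is the 99th-percentile frame rate.
    std::sort(ft.begin(), ft.end());
    const double p99ms = ft[static_cast<size_t>(0.99 * (ft.size() - 1))];
    const double avgms = std::accumulate(ft.begin(), ft.end(), 0.0) / ft.size();

    printf("average : %.1f fps\n", 1000.0 / avgms);
    printf("99th pct: %.1f fps (%.1f ms)\n", 1000.0 / p99ms, p99ms);
    // A high average with a low 99th percentile is the SLI microstutter
    // signature: the counter says 80 fps, the hands feel 20.
    return 0;
}
```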