StamatisX (Founder) Posted July 22, 2011

I was reading an article about Nvidia having better image quality than AMD at the same settings. What really prompted me to create this thread, though, was this review, and in particular this part of the verdict:

"What is left to say about the GeForce GTX 580M? Yes, this graphic card is without a doubt the new king of performance, and, yes, thanks to its various features and the better picture quality, the graphic card can distinguish itself from its main AMD competition."

Since we have forum members who have tried both brands on their laptops (e.g. the 460M), I was wondering if you noticed the same thing.
Granyte Posted July 22, 2011

Notebookcheck and Nvidia's own website... need I say more? Haha. We would need people with both cards who could take pictures of the exact same positions with the two different cards for anyone to be convinced, and also to shut down the fanboy arguments (note: I consider myself an ATI fan).
StamatisX (Founder) Posted July 22, 2011

I know for a fact that our moderator @svl7 has both a 460M and a 6970M in his M15x, so he could shed some light on this.
iloveb00bs Posted July 22, 2011

I have used the 260, 460 and 470 from Nvidia, and the 5850, 5870 and 6970 from ATI. Honestly, I really can't tell the difference in image quality at all.
StamatisX (Founder) Posted July 22, 2011

At this point it would be useful to play a game at, let's say, 1080p with all settings maxed out, and take a few screenshots and a video recording with both cards, so we can see if there is some noticeable difference between them.
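Not something from the original post, but if anyone does capture matching screenshots, here is a minimal sketch of how they could be compared numerically. It assumes Python with Pillow and NumPy installed; the file names are placeholders for whatever captures people end up posting.

```python
# Rough comparison of two screenshots taken at the same position and settings.
# File names below are placeholders, not actual captures from this thread.
from PIL import Image
import numpy as np

def compare_screenshots(path_a, path_b):
    """Return mean absolute per-channel difference and PSNR between two images."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    if a.shape != b.shape:
        raise ValueError("Screenshots must have the same resolution")
    mse = np.mean((a - b) ** 2)
    psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
    return float(np.mean(np.abs(a - b))), psnr

mean_diff, psnr = compare_screenshots("gtx580m_scene1.png", "hd6970m_scene1.png")
print(f"Mean per-pixel difference: {mean_diff:.2f} (0-255 scale), PSNR: {psnr:.1f} dB")
```

Identical renders would give an infinite PSNR, and anything above roughly 40 dB is usually hard to spot by eye, so a low number here would at least tell us the difference is real rather than imagined.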
Granyte Posted July 22, 2011

Yup, because even when comparing the 460 vs the 6970, since the 6970 is far more powerful than the 460, the simple fact that it gives more fps would make the AMD card seem better at the same settings, because it would be smoother. Unless we are comparing the price/image quality ratio, in which case we obviously know who wins, lol. OK, I'll stop now. /fanboyism
StamatisX (Founder) Posted July 22, 2011

I want to be objective in this matter, and I believe this is something worth investigating, since we all know that drivers can make a huge difference in performance. So my basic question is whether AMD is sacrificing quality for performance compared to Nvidia. Think about it like an image that you want to save as a JPEG: at that point you can select how much information you want to lose in order to achieve a smaller size. So, in our case, what settings does each company apply in order to balance performance against quality? I wish I had GPUs from both sides to do some testing, but unfortunately I am limited to ATI.
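To make the JPEG analogy concrete, here is a small sketch (again assuming Python with Pillow; the input file name is a placeholder) that shows how much data is discarded as the quality setting is lowered, which is the same kind of quality-versus-cost tradeoff the driver defaults boil down to:

```python
# Illustrates the quality/size tradeoff from the JPEG analogy above.
# "screenshot.png" is a placeholder input image.
import io
from PIL import Image

img = Image.open("screenshot.png").convert("RGB")
for quality in (95, 75, 50, 25):
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)  # lower quality discards more detail
    print(f"JPEG quality {quality}: {buf.tell() / 1024:.1f} KiB")
```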
iloveb00bs Posted July 22, 2011

This is an interesting article, and it will make me take a closer look at image quality. I will see if I can get my hands on another Nvidia card to test.
svl7 Posted July 22, 2011

StamatisX wrote: "I know for a fact that our moderator @svl7 has both a 460M and a 6970M in his M15x, so he could shed some light on this."

Yeah, almost... I had the 470m in the system for a while, though I don't have it anymore, I sold it. I didn't use it a lot, but Vantage does certainly look a bit different than with the 6970m. Whether it's better or not is hard to tell, since I didn't have the possibility to compare them directly, but there's definitely a certain difference: colors, textures, shadows... hard to describe.

It's totally normal that AMD and Nvidia don't deliver identical pictures. Different cards handle and process the data in different ways, so there are bound to be certain variations, in colors for example, and also when rendering the frames of a 3D application. I'm no GPU expert, not at all, but since the cards have completely different architectures, the results are bound to vary. I can't tell which card is better, but I'd say for gaming purposes the color differences, for example, are negligible, and the faster card will definitely provide a more satisfying gaming experience.

When it comes to professional photo/video editing it's much more important to have the colors displayed as accurately as possible, but this is very dependent on the screen and the color calibration. I think that matters more than the GPU, but I might be wrong; as I said, I'm not an expert on this subject. If I remember correctly, Nvidia's Quadro cards and AMD's FirePro GPUs are optimized not only for CAD work but also for color accuracy. I haven't read the article yet, but as it got posted on the Nvidia blog I'd say it is probably a bit biased.

Btw, I have a 260m and a 5850m lying around as well... but it's pretty much impossible to really compare the picture quality of the cards with only one system, otherwise I'd do some testing. It would be interesting to see.
svl7 Posted July 22, 2011

I've just read the article on Nvidia's blog, and to be honest I don't think it's such a big thing. The Nvidia control panel is very different from the CCC, and it does indeed have higher default settings as far as I remember, but power users adjust the settings anyway so that they fit their needs, no matter whether there's an AMD or Nvidia card in the system. And even if you don't change the default settings, a lot of in-game graphics settings override the CCC/NCP settings anyway, so I don't think it's a big problem. Lower quality settings are definitely an advantage in benchmarks, but most people tweak the settings for benching anyway since they want to score as high as possible. At least Nvidia didn't forget about their own crazy tweaks in the past: "For those with long memories, NVIDIA learned some hard lessons with some GeForce FX and 3DMark03 optimization gone bad, and vowed to never again perform any optimizations that could compromise image quality." And well, we can't be 100% sure there aren't any hardware tweaks (for boosting benchmark performance) implemented in the GPU itself; I think Nvidia and AMD both have some tricks for this.

Even though good drivers are crucial, the possibilities of software optimization are limited. A lot of things are hardware-bound and depend on the architecture of the GPU: not every architecture supports the latest AA modes, and some are better at tessellation than others. I think this is more interesting than the default software settings of the GPU control panel; those can be adjusted, whereas the architecture of your GPU can't. What's really interesting imo are (besides the overall performance) the post-processing capabilities of the GPU. They can make a huge difference in how a game looks and feels, and there are big differences between GPUs regarding the quality of depth of field, shadow mapping, HDR effects etc. Way more important than the default settings of the GPU control panel. Just my 0.02.
kune Posted August 13, 2011

In all honesty, I noticed that DA2 looked exceptionally better going from the 460M to the 6970M at the same settings (meaning I found AMD/ATI image quality better, at least in that game). That's basically the only game I played on the 460M.
Ninjahunter Posted August 13, 2011

I remember there was a site (Tweak Guides, I believe) a couple (or more) years ago that did a comparison, and the only NOTICEABLE difference they found was that ATI has an HQ AF feature. But this was a long time ago.
mw86 Posted April 10, 2012

Honestly, I have used the 5870, 6970m and 580m, and when I game I turn off anisotropic optimizations and the bilinear/trilinear optimization, set the driver to clamp the negative LOD bias to preserve full image quality, and set textures to high quality, in both CCC and the Nvidia Control Panel. It's hard to tell, but set up like that, textures seemed to look better on the Nvidia card, just a smidgen sharper. On AMD it felt like I wasn't able to turn all the optimization features off, even though every available one was. There is also something not present on AMD: in a game or two, if I look carefully (The Witcher 2, for example), the Nvidia textures have light, hard-to-see striations or lines. It's hard to describe and perhaps has to do with the differences between AMD's and Nvidia's processing.

Years ago, when I was using a rig to play Morrowind in a similar manner, with high quality settings on an older GeForce the water textures were sharp, but on a comparable AMD card of the time, also using all high-quality settings, the water would always show as a lower-res texture than on the Nvidia.

But I am very skeptical whether such differences really exist now, and going by memory doesn't help any of us compare. iloveb00bs would be the guru of image quality differences, since he has had a ton of GPUs from both companies. So we really need some people to take screenshots at a preset resolution in the same games, with the game settings shown in one pic and the video card's options shown in an additional pic. Then we can compare scenes from one card to the other, with PhysX not enabled of course...
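For anyone who does collect such matched screenshots, here is a minimal sketch (same assumptions as the earlier snippet: Python with Pillow, placeholder file names and an arbitrary amplification factor) that produces a visual difference map, which would make subtle texture or shading differences like the striations described above much easier to spot:

```python
# Writes an amplified absolute-difference image of two same-resolution captures.
# File names and the amplification factor are placeholders.
from PIL import Image, ImageChops

def save_difference_map(path_a, path_b, out_path, amplify=8):
    """Exaggerate small per-pixel differences so they become visible by eye."""
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")
    diff = ImageChops.difference(a, b)                   # per-pixel absolute difference
    diff = diff.point(lambda v: min(255, v * amplify))   # boost subtle deltas
    diff.save(out_path)

save_difference_map("nvidia_witcher2.png", "amd_witcher2.png", "difference_map.png")
```

A mostly black output would mean the two cards render the scene essentially identically; bright regions show exactly where they diverge.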