NVIDIA GeForce GTX880M SLI 16GB Problems



I'm getting extremely frustrated with not being able to find the right answers to my problem. But before I continue, I apologize if I am posting in the wrong area; I only joined up a few days ago.

Here are my system specifications.

Alienware 18

Windows 8.1 Pro

Intel® Core i7-4810MQ CPU @ 2.80GHz (8 CPUs), ~2.8 GHz

16384 MB RAM

DirectX 11

NVIDIA GeForce GTX 880M 16188 MB

CUDA Cores 1536

Graphics/Core Clock: 954 MHz + Boost

Memory Clock up to 2500 MHz

Standard Memory Configuration GDDR5

Memory Interface Width 256-bit

Memory Bandwidth (GB/sec) 160.0

FEATURE SUPPORT: NVIDIA BatteryBoost, Optimus, GeForce Experience, FXAA, TXAA, PhysX, Direct Compute, GPU Boost 2.0, Blu-ray 3D, NVIDIA SLI-Ready

OpenGL 4.5

OpenCL 1.1

Bus Support PCI Express 2.0, PCI Express 3.0

LINK TO GRAPHICS CARD: GeForce GTX 880M | Specifications | GeForce

My problem is that I feel my graphics cards in SLI should be performing way better than they are. I've benchmarked with 3DMark and scored above 7000 the first time and above 8000 the second time. Both times I got a notification saying that the graphics drivers were not approved. I have the latest ones as of now, which are:

Release date: 6/22/2015 Version: 353.30

I've been digging around on Google and found a link titled something like "Extremely frustrated with GTX880M". I clicked on it, given my own frustration, and the poster said something about flashing the BIOS, which I don't need to do. I also went to the Dell website for drivers and didn't need to update anything.

I figured overclocking should do the trick, so I downloaded the performance tab for the NVIDIA Control Panel. I stepped the graphics clock on both cards from 954 > 980 > 990 > 1000 > 1200 > 1350 > 1500 > 1700 > 1800 MHz (1908 MHz was the highest the slider allowed, but I never went above 1800 MHz, and I left the memory clock untouched at 2500 MHz on both cards because changing it gives me a BSOD). I played Borderlands 2 as my test game after each overclock, and I don't think it made any difference. I made sure I applied the new settings each time, and yes, I game with the charger plugged in. All settings are on high: PhysX, anisotropic filtering, 1920x1080, you name it, and I still get low frame rates in heavy gunfights.

Knowing for a fact that my cards in SLI (yes, I made sure SLI is enabled in the NVIDIA Control Panel) give me 16GB of video memory in total, why am I still seeing FPS drops to below 30, and rarely even down to 19? When I run Crysis 2 with everything on Ultra (Ultra is higher than Extreme in the Crysis settings) it plays amazingly, yet when I run Heaven Benchmark 4.0 I get low FPS with occasional peaks as high as 142.6 FPS. (Keep in mind that I closed all unnecessary programs like Steam, GeForce Experience, Origin, and Dolby Digital, and that the Heaven settings were Ultra quality, no tessellation, 1920x1080 fullscreen, with Stereo 3D, multi-monitor, and anti-aliasing all disabled.)

Before I forget, I've also overclocked my CPU through the BIOS: the max speed was around 3.78 GHz before and is around 4.17 GHz now. To show what the cards actually report, I've put a small query sketch right after the settings list below. Here are my current Manage 3D Settings for my GeForce GTX 880Ms:

SLI configuration: Maximize 3D performance

PhysX settings Processor: Auto-select (recommended) PhysX>GeForce GTX880M (2)

Ambient Occlusion: Off

Anisotropic filtering: Application-controlled

Antialiasing - FXAA: Off

Antialiasing - Gamma correction: On

Antialiasing - Mode: Application-controlled

Antialiasing - Setting: Application-controlled

Antialiasing - Transparency: Off

CUDA - GPUs: All

DSR - Factors: Off

DSR - Smoothness: Off

Maximum pre-rendered frames: 4

Multi-display/mixed-GPU acceleration: Single display performance mode

Power Management mode: Prefer maximum performance

SLI rendering management mode: Force alternate frame rendering 2

Shader Cache: On

Texture filtering - Anisotropic sample optimization: On

Texture filtering - Negative LOD bias: Allow

Texture filtering - Quality: High performance

Texture filtering - Trilinear optimization: On

Threaded optimization: On

Triple Buffering: Off

Vertical sync: Adaptive

Virtual Reality pre-rendered frames: 1
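
As mentioned above, here is the small query sketch I've been using to see what each card actually reports. It's just Python calling nvidia-smi, the command-line tool that ships with the NVIDIA driver; I'm assuming nvidia-smi is on the PATH, and from what I've read some of these query fields can come back as "N/A" on mobile GeForce cards, so treat it as a rough check rather than anything definitive.

import subprocess

# Rough check: ask nvidia-smi what each GPU reports for memory and clocks.
# Assumes nvidia-smi is on the PATH; some fields may show "N/A" on
# mobile GeForce cards.
QUERY = "name,memory.total,clocks.sm,clocks.mem,temperature.gpu"

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=" + QUERY, "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)

# nvidia-smi prints one CSV line per GPU, so both 880Ms show up separately.
for idx, line in enumerate(result.stdout.strip().splitlines()):
    print("GPU {}: {}".format(idx, line))

If clocks.sm doesn't move after I change the slider, I'm guessing that means my overclock isn't actually being applied.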

I've also read around that NVIDIA is "crippling" our cards, and that with every driver update our FPS gets worse and worse. Is there any possible way to get my GeForce GTX 880Ms in SLI working at their full potential? If there is, kindly let me know step by step. I'm a noob and I know that I don't know what I'm doing. Any help is greatly appreciated.

Thank you for reading my post.


What's your score on Fire Strike 1.1? I have the same laptop but with a 4910MQ, and I got a max of 9400 points using the svl7 vBIOS. I think you might have a heat problem. Try running RivaTuner while benchmarking and see if you get any drops in your clocks or your temperatures.
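
If you don't have RivaTuner handy, a rough alternative is to log it yourself; the sketch below is my own, not anything official. It just samples nvidia-smi once a second while the benchmark runs. I'm assuming nvidia-smi is on the PATH, and some fields (power.draw especially) may just read "N/A" on these mobile cards.

import csv
import subprocess
import sys
import time

# Log clocks, temperature, and utilization once per second to gpu_log.csv.
# Stop it with Ctrl+C once the benchmark run is over.
FIELDS = "index,clocks.sm,temperature.gpu,utilization.gpu,power.draw"

with open("gpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s"] + FIELDS.split(","))
    start = time.time()
    try:
        while True:
            out = subprocess.check_output(
                ["nvidia-smi", "--query-gpu=" + FIELDS,
                 "--format=csv,noheader,nounits"],
                text=True,
            )
            elapsed = round(time.time() - start, 1)
            # One CSV line per GPU, so both cards get a row per sample.
            for line in out.strip().splitlines():
                writer.writerow([elapsed] + [v.strip() for v in line.split(",")])
            f.flush()
            time.sleep(1.0)
    except KeyboardInterrupt:
        sys.exit(0)

If clocks.sm drops while the temperature sits up in the 90s, that points to thermal throttling rather than a driver problem.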


It's in my post:

I've benchmarked with 3DMark and scored above 7000 the first time and above 8000 the second time. Both times I got a notification saying that the graphics drivers were not approved. I have the latest ones as of now, which are:

Release date: 6/22/2015 Version: 353.30

