
So officially the Asus Maximus V Formula doesn't support Tri-SLI, just Tri-CrossFireX and 2-way SLI. The reason, I'd guess, is that NVIDIA won't certify SLI on a x4 electrical slot (with two cards the board runs x8/x8). The Extreme version of the Maximus V does support Tri-SLI, but that board runs $400+.

One way around it is a program called HyperSLI, a simple install-and-click tool. Just make sure you have a processor with VT-x (software emulation is also possible) and two short SLI bridges plus one long bridge to connect the three cards. In my case I connected three EVGA GTX 680s (two 680 SC Signature 2 and one 680 SC). The lack of bandwidth doesn't seem to hurt performance much in games or benchmarks on a single monitor at 2560 x 1440. With a triple display, things may change:
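Since HyperSLI depends on hardware virtualization, it's worth confirming VT-x (or AMD-V) is present before installing. A minimal sketch for a Linux shell session; on Windows, `systeminfo` or Task Manager > Performance > CPU shows the same information:

```python
# Check whether the CPU advertises hardware virtualization (VT-x shows
# as the "vmx" flag, AMD-V as "svm"). Linux-only sketch reading
# /proc/cpuinfo; the parsing helper is illustrative, not part of HyperSLI.
def has_hw_virt(cpuinfo_text: str) -> bool:
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            return "vmx" in flags or "svm" in flags
    return False

if __name__ == "__main__":
    try:
        with open("/proc/cpuinfo") as f:
            print("VT-x/AMD-V present:", has_hw_virt(f.read()))
    except FileNotFoundError:
        print("Not Linux; check Task Manager > Performance > CPU instead.")
```

If the flag is missing, check the BIOS first: many boards ship with virtualization disabled even when the CPU supports it.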

3DMARK 11

Performance: P21377 3DMarks (GTX 680, Intel Core i7-3770K, ASUS Maximus V Formula)

Extreme: X10444 3DMarks (GTX 680, Intel Core i7-3770K, ASUS Maximus V Formula)



OC that CPU more! I've run my laptop higher than that. You might also want to OC the BCLK some for more PCIe bandwidth, since two of the cards are running at x4. Ivy Bridge can get over 10% on BCLK, unlike Sandy Bridge.
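The reasoning behind the BCLK suggestion: on this platform the PCIe clock is derived from BCLK, so raising it scales link bandwidth proportionally. A back-of-envelope sketch (the numbers are theoretical PCIe 3.0 figures, not measurements from this rig):

```python
# Rough one-direction PCIe 3.0 bandwidth per link width, and how a BCLK
# bump scales it. Illustrative math only.
GT_PER_LANE = 8.0          # PCIe 3.0 signalling rate: 8 GT/s per lane
ENCODING = 128.0 / 130.0   # 128b/130b line encoding overhead

def pcie3_gbps(lanes: int, bclk_mhz: float = 100.0) -> float:
    """Usable one-direction bandwidth in GB/s for a PCIe 3.0 link."""
    per_lane = GT_PER_LANE * ENCODING / 8.0        # GB/s per lane at stock BCLK
    return lanes * per_lane * (bclk_mhz / 100.0)   # scales linearly with BCLK

for lanes in (4, 8, 16):
    print(f"x{lanes}: {pcie3_gbps(lanes):.2f} GB/s stock, "
          f"{pcie3_gbps(lanes, 105):.2f} GB/s at 105 MHz BCLK")
```

So a 5% BCLK overclock buys a x4 slot roughly 0.2 GB/s extra; it helps, but it won't turn x4 into x8.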


Will give BCLK a shot, although the CPU is at 4.5 GHz 24/7. Anything higher requires a big jump in voltage for some reason.


Nah... a 3770K @ 4.5 GHz is awesome. 4.5 is the sweet spot for the 3770K in terms of power, heat and performance, IMHO.
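There's a simple model behind the "sweet spot" intuition: dynamic CPU power scales roughly with V² x f, so the voltage jump needed past 4.5 GHz costs disproportionate heat. A sketch with made-up illustrative voltages (not measured values for any particular chip):

```python
# Relative dynamic power for a hypothetical voltage/frequency step,
# using the classic P ~ C * V^2 * f approximation. Baseline 1.20 V at
# 4.5 GHz and the 1.35 V / 4.8 GHz example are illustrative guesses.
def rel_power(v: float, f_ghz: float, v0: float = 1.20, f0: float = 4.5) -> float:
    """Dynamic power relative to the v0/f0 baseline."""
    return (v / v0) ** 2 * (f_ghz / f0)

print(f"{rel_power(1.35, 4.8):.2f}x baseline power")  # ~35% more heat for ~7% more clock
```

That asymmetry is why a "big jump in voltage" for a few hundred MHz usually isn't worth it for a 24/7 clock.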


My 3570K needs 1.2 V for 4.5 GHz and it has been like that since day one :) It can do 4.7 GHz easily, but I find myself very comfortable at 4.5 GHz with low temps.


Can you not just use a tri-SLI bridge instead of two short SLI bridges and one long SLI bridge?

Can you not just use a tri-SLI bridge instead of two short SLI bridges and one long SLI bridge?

Nope, the length and spacing are off.

Sent from my GT-N7000


Yeah, but can't you "bend" one, since they are usually very flexible? That would take care of the spacing.

Yeah, but can't you "bend" one, since they are usually very flexible? That would take care of the spacing.

It's a PCB, not a cable. Anyway, I've got dual Titans now.

Sent from my GT-N7000


Hi OP,

Just to clarify: you are running three GTX 680s in SLI, with a little bit of software to get the third card running, and this works well? I was thinking about doing this when I saw your post. What are the gains from having a third card? What are you losing due to two of the cards dropping to x4?

Thanks for your help!


Gains from a third card are almost non-existent; microstuttering might become slightly less of a problem, but that's all. A CPU limit is a bad thing...

Gains from a third card are almost non-existent; microstuttering might become slightly less of a problem, but that's all. A CPU limit is a bad thing...

Not true: the third card scaled quite well when I ran 3 x 680s in the OP. Microstuttering wasn't an issue I noticed, either.
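A quick way to settle "did the third card scale?" arguments is to compute scaling efficiency from benchmark FPS: frames with n cards divided by n times the single-card result. The FPS numbers below are made-up placeholders, not results from this thread:

```python
# Multi-GPU scaling efficiency: 1.0 means perfect linear scaling,
# lower values mean each extra card contributes less. Example FPS
# figures are hypothetical.
def scaling_efficiency(fps_n: float, fps_1: float, n: int) -> float:
    """Fraction of ideal n-way scaling actually achieved."""
    return fps_n / (n * fps_1)

print(f"2-way: {scaling_efficiency(110, 60, 2):.0%}")
print(f"3-way: {scaling_efficiency(150, 60, 3):.0%}")
```

Anything much above ~70% on the third card is usually considered worthwhile scaling, assuming the CPU isn't the bottleneck.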


Hey Brian, I'm trying to run 3-way SLI on the same board you had. Would I be able to do it? Also, I can't get a card into the last slot. Would I need a new case, and how did you manage to fit a card in the last slot? Thanks for any advice; my specs are listed below.

Specs

Motherboard: Asus Maximus V Formula

Power supply: 850 W

SSD: Crucial MX100 250 GB

HDD: 1 TB Seagate

Blu-ray drive

CPU: Intel Core i7-3770K

RAM: 16 GB

Case: NZXT Crafted Series Phantom Black/Green Steel/Plastic ATX Full Tower


