TrojanTheGod

The New Nvidia GTX 1070

Recommended Posts

So guys, what are your thoughts on the new latest and greatest from Nvidia? According to benchmark results posted by Nvidia on their official site and shown at the conference, the GTX 1070 pulls slightly ahead of the Titan X, which is a pretty good feat considering that it costs $359. Will you guys be settling for this or the 1080?


I have a GTX 980 and will probably keep using it for a while before I pull the trigger on either the GTX 1070 or 1080. I've only had the 980 for about seven months.


If you're coming from an older card, like a 660, is it worth the extra money to go to a 1080, or is the 1070 going to be good enough in terms of value for the upgrade?


I currently have a 970 in my desktop, and even though my monitor is 4K I mostly play at 1080p, so I'm not sure it's worth upgrading quite yet. I'll keep up with the chatter and possibly move to a 1080 and 4K resolution in a year or two.


I'm planning my new rig and was waiting for the GTX 1070, but pricing in € is just ridiculous. I'll wait a few weeks for tests of the custom designs and see whether prices drop after AMD Polaris is released.

Since there aren't any reliable tests of Polaris atm, I'm considering a CrossFire setup of two R9 480/480X or higher. I'm a fan of Nvidia and thought I'd never buy AMD, but the extras (also the price of FreeSync monitors compared to G-Sync ones) have me thinking about switching sides if there's no real performance benefit, and investing the saved euros in a PCIe SSD or whatever... maybe some beers ;-).


The $250 GTX 1060 is almost here (July 19th), and it's still way cheaper, faster, and more efficient than a 980.

But I guess I'll go for the 1070 in the next couple of months, once it's clear whether there are any significant issues with the first batch.

The Founders Edition is as gorgeous as the reference card design.



The price of the 1070 has gone above $400; I'm waiting for it to come back down to the quoted MSRP of $350.


  • Similar Content

    • By Sparkhunter
      Hi there guys,
       
      I've been looking into getting a new GPU for my laptop (4-lane PCIe) and enclosure (HP Omen Enclosure), but I cannot decide between the RX 5700 XT and the RTX 2070 (non-SUPER).
       
      I did some price to performance calculations (higher is better) with some of the best prices I can currently get in the UK.
       

       
      The RX 5700 XT does perform better and is cheaper than the RTX 2070 (although the 2070 does come with two games), but my main concern is drivers, and from what I've heard Nvidia cards work a lot better with eGPUs than AMD.
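      That kind of price-to-performance comparison boils down to dividing a benchmark score by the price. A minimal sketch, where both the scores and the UK prices are illustrative placeholders rather than the figures from this thread:

```python
# Rough price-to-performance sketch (higher is better).
# Scores and prices below are hypothetical examples,
# not measured results from this thread.
cards = {
    "RX 5700 XT": {"score": 100, "price_gbp": 370},
    "RTX 2070":   {"score": 96,  "price_gbp": 420},
}

for name, card in cards.items():
    ratio = card["score"] / card["price_gbp"]
    print(f"{name}: {ratio:.3f} points per £")
```

      With numbers like these the RX 5700 XT comes out ahead on value, though bundled games and driver stability are not captured by the ratio.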
       
      If anyone has experience using both red and green, or has any other input, I'd love to hear it.
       
      Thanks.
       
    • By Solaire
      Hello everyone,
      Some time ago I flashed a custom vBIOS (thank you Klem) on an Acer Predator 15 G9-591, and I started testing how far I could overclock the GTX 970M without running too hot. With the voltage up to 1150 mV and around 300 additional MHz on the core clock, under the heaviest stress on both GPU and CPU (i7-6700HQ, undervolted), the GPU reaches a max temp of 85°C (with some 86°C spikes, but never more than that).
      Considering that soon I'm going to relid both CPU and GPU and clean out the internals (the laptop is 4 years old), I think I'll have some additional thermal headroom.
      I've already tested further overclocking, and I noticed that even when temperatures remain under 93°C (which is just for testing purposes; after relidding, temps could be nicer), graphical glitches occur and after some time most games crash. But my question is: could it be a lack of power supply? The laptop charger provides 180 W.
      Could there be more overclocking headroom with a 230 W PSU or something like that? (Obviously with the same 19.5 V output.)
      If anybody has tried something like that on any laptop model, or knows about this, I'd like to hear it.
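      For a rough sense of whether the 180 W adapter could be the bottleneck, a back-of-the-envelope power budget helps. Every wattage below is an assumption for illustration, not a measurement from this laptop:

```python
# Back-of-the-envelope power budget for a 180 W laptop adapter.
# All component wattages are assumptions, not measured values.
adapter_w = 180
gpu_stock_w = 81        # approximate GTX 970M board power
gpu_oc_extra_w = 25     # guessed extra draw from +300 MHz at 1150 mV
cpu_w = 35              # guessed draw of an undervolted i7-6700HQ under load
rest_of_system_w = 30   # guessed screen, RAM, SSD, fans, conversion losses

total_w = gpu_stock_w + gpu_oc_extra_w + cpu_w + rest_of_system_w
headroom_w = adapter_w - total_w
print(f"Estimated draw: {total_w} W, headroom: {headroom_w} W")
```

      If the estimated headroom really is this thin, a beefier adapter could plausibly help under combined load, though glitches and crashes at higher clocks are more often a sign of core or memory instability than of a power limit.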
    • By duffman_777
      Managed to find a relatively cheap MSI 1070 on eBay, and after reading a few success stories saying that 1070s are compatible, I decided to pull the trigger.
       
      Modified the bottom case as per the photo on this FAQ page http://www.eurocom.com/ec/faqs(272)ClevoP150EM_SagerNP9150_XMG_P502_PRO. Things were going well until I went to screw the card down and felt some resistance.
       
      Oops. Looks like that capacitor(?) with the blue on it *just* clashes with the card. Now, what's weird is that not all P150em motherboards seem to have this component; looking at replacements, it isn't there. Also, looking at the photos from this 1080m P150em mod (https://premamod.wordpress.com/2017/10/10/clevo-pascal-mxm-standard/), this component isn't there either. Does anyone know what it is, and whether it can be removed, shifted out of the way somehow, or replaced with a smaller equivalent?
       
      In my disbelief/annoyance/problem-solving while trying to push it down, I did notice that the card has a surprising amount of flex, and it *can* be screwed down almost completely, to a point I'd feel somewhat comfortable with... though I'd rather not put that strain on the card if there's a way to avoid it.
       
       
       
       
       



    • By ounces
      After spending significant time and effort to obtain a "DC" screen for the 8770w (which is essentially a regular IPS panel with a fancy board that converts 8 bpc LVDS to 10 bpc DP), I have finally got one and installed it. All works great, except for one problem...

      It has pretty bad banding/posterization in the lower shadows. I have tried profiling it in different modes (full range, sRGB, Rec. 709); the issue persists, and it indeed shows only in the lowest part of the characteristic curve. Mids and highlights are represented fine and show low deviation from reference values.

      The GPU is an HP K4000M; Nvidia drivers installed as-is, and the video card is identified without a hitch.
      Banding was not present with the original TN panel using the same GPU.
       
      While checking the software side, I noticed that Win10 has the bit depth set to 8-bit...
       

       
      My initial reaction was: "Easy, let's change it in `nvidia-settings` and we're all set":

      ...but that would be too easy, right? After selecting 10 bpc and clicking "Apply", the screen went off and back on, only to show that the depth stayed at 8 bpc. Repeating the above a few times yielded exactly the same result, and I'm not in a hurry to meet the clichéd (and layman's) definition of insanity.
       
      Let's check GPU-Z. So far so good, nothing unusual. Notice the highlighted BIOS version and subvendor string:
       
      Time to delve into the other tabs. We are running WDDM v2.4, which supports GPU dithering, but hey... the BIOS version has changed!
       
      Briefly back to `nvidia-settings` to check what is reported by vendor's own utility:

       
      So far, we have two strings for the BIOS version:
      80.04.5A.00.02 (let's call it "A")
      80.4.33.0.37 (let's call it "B")
      Notice how the 2nd one doesn't seem to follow hexadecimal notation. Lastly, the "NVIDIA BIOS" drop-down reports the "A" version:
       
      ...and the monitor section, which confirms that the rig is indeed capable of 10 bpc but currently running at a mere 8 bpc:

       
      Windows "Adapter settings" reports version "B". It's 2019, diversity is a must.

       
      "NVidia inspector" is of the same opinion:

       
      Now, let's use some seriously legit tools and check the exported BIOS file in `nvflash`:

       
      Here we have three interesting findings:
      1. Reported vendor is Dell, not HP. See this link for details.
      2. BIOS version is back to "A". Have I already mentioned diversity?
      3. The MXM module uses MX25L2005 flash storage in a WSON-8 package. If things go really nasty, we should be able to rescue the patient via a Pomona clip and an external programmer.
      Loading the same file in "Kepler BIOS Tweaker" confirms these facts:

       
      EDID settings, courtesy of NVIDIA Control Panel. A hex dump can be found at the bottom of this post.
      ...Should I be worried about the "60.02 Hz" refresh rate?
       
      To summarize:
      1. Why are two different BIOS versions reported? Anything to do with UEFI (e.g. HP sideloading its own during boot)?
      2. Why are two different vendors reported? As far as I remember, this is a branded HP GPU.
      3. Where can I get a "clean" K4000M BIOS for future experiments? Ideally from an 8770w equipped with a "DreamColor" panel from the factory.
      Link to the dumps, BIOS ROM and monitor EDID: https://mega.nz/#F!zGgRmQIL!9q2QFZtHuK2RQ-WHXMA4Mg (also attached to this post)
      K4000M.zip
    • By Blocker35
      Hi guys, bit of a rookie to the whole EGPU scene. Currently I have:
       
      - MacBook Pro 2015 13-inch (3.1 GHz Core i7, 16 GB RAM)
      - Razer Core X
      - Apple TB3 to TB2 adapter
      - TB2 cable (Cable Matters)
      - 23-inch AOC external monitor
       
      I'm wondering what graphics card to get to run X-Plane 11 at high graphics settings.
      I have purgewrangler set up and ready to use with an AMD graphics card, but I'm also open to the idea of an Nvidia graphics card.
      Any advice appreciated. I did not buy a Windows PC as I need a Mac for various other things and wanted an all-in-one laptop.