Brian

Do you plan to purchase an NVIDIA Turing GPU (RTX 2070/2080/2080 Ti)?

NVIDIA Turing RTX GPU  

141 members have voted

  1. Do you plan to purchase an NVIDIA Turing GPU (RTX 2070/2080/2080 Ti)?

    • Yes I plan to purchase one of the new RTX GPUs or have already put in a pre-order for one.
    • I'm waiting on reviews/benchmark analysis before I make a decision.
    • I'm going to hold on to my current graphics card and wait on the 7nm cards to be released.
    • I don't plan to upgrade my GPU anytime soon.
    • I would upgrade but the price is too expensive so I can't afford it.
    • I want to see what AMD releases with their 7nm Navi.


Recommended Posts

So with the new Turing GPUs announced and up for pre-order, do you plan to upgrade to one given the price increase in each segment? How important are the new ray tracing and DLSS features in making a purchasing decision for you? 


I'm building a new PC so I need to get something, unfortunately.

 

This was an obvious strategy by nVidia to try to sell off their glut of 10-series cards by pricing RTX the way they did. 1080ti prices have actually been going up, since those are in higher demand now. I'll probably end up getting an RTX 2080, because at this point it's only marginally more expensive than a decent 1080ti.

 

Oh, and this whole "founders prices" stuff needs to stop.  We will never see RTX cards sell for the prices nVidia stated in their press conference ($699/$999).  All the partner cards are releasing at the founders prices, and there's no reason for nVidia to drop prices with no meaningful competition in the market from AMD or anyone else.  AMD doesn't have anything to compete with either RTX card or the 1080ti for that matter.


I have been using AMD for a while now. I've got twin MSI Gaming X RX 480 Radeons with a heavy overclock on a custom water cooling loop, and I've been loving the cards. They're great not only for the pocket, but if you have time up your sleeve you can fine-tune them and hit a really nice sweet spot. Back on topic: I don't feel it's necessary to spend over a grand on a GPU. I would wait and see what AMD comes out with; at least you'd get some really decent performance that isn't going to hurt your pockets.


I'm still running my Gigabyte HD 7970, which is chugging along fine in most games I play.

 

Unless I see an actual need to upgrade (or my computer blows up), I'll be sticking with my 2012 build for a couple more years.



  • Similar Content

    • By Sparkhunter
      Hi there guys,
       
      I've been looking into getting a new GPU for my laptop (4-lane PCI-e) and enclosure (HP Omen Enclosure), but I can't decide between the RX 5700 XT and the RTX 2070 (non-SUPER).
       
      I did some price to performance calculations (higher is better) with some of the best prices I can currently get in the UK.
       

       
      The RX 5700 XT does perform better and is cheaper than the RTX 2070 (although the 2070 does come with 2 games), but my main concern is drivers; from what I've heard, NVIDIA cards work a lot better with eGPUs than AMD ones.
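      A "higher is better" price-to-performance comparison like the one described above boils down to dividing an average benchmark figure by the price. A minimal sketch, using made-up placeholder prices and FPS numbers (the real figures from the post aren't available here, so treat every value as an assumption to be replaced):

```python
# Hypothetical UK prices (GBP) and average-FPS figures -- for illustration
# only; substitute real benchmark results and current street prices.
cards = {
    "RX 5700 XT": {"price": 380.0, "avg_fps": 100.0},
    "RTX 2070":   {"price": 420.0, "avg_fps": 95.0},
}

def perf_per_pound(card):
    """Higher is better: average FPS per GBP spent."""
    return card["avg_fps"] / card["price"]

for name, card in cards.items():
    print(f"{name}: {perf_per_pound(card):.3f} FPS/GBP")
```

      With these placeholder numbers the RX 5700 XT comes out ahead, matching the post's conclusion, but the ranking obviously depends entirely on the inputs.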
       
      If anyone has experience using both red & green, or has any other input, I'd love to hear about it.
       
      Thanks.
       
    • By Solaire
      Hello everyone,
      Some time ago I flashed a custom vBIOS (thank you Klem) on an Acer Predator 15 G9-591, and I started testing how far I could overclock the GTX 970M without running too hot. With the voltage up to 1150mV and roughly 300 additional MHz on the core clock, under the heaviest stress conditions for both GPU and CPU (i7-6700HQ, undervolted), the GPU reaches a max temp of 85C (with some 86C spikes, but never more than that).
      Considering that I'm soon going to relid both CPU and GPU and clean everything in the internals (the laptop is 4 years old), I think I'll have some additional headroom.
      I've already tested some further overclocking and noticed that even if temperatures remain under 93C (which is just for testing purposes; after relidding, temps could be nicer), graphical glitches occur and after some time most games crash. But my question is: could it be a lack of power supply? The laptop charger provides 180W.
      Could there be an increase in overclock margins with a 230W psu or something like that? (Obviously with the same 19.5V output)
      If anybody has tried something like that on any laptop model, or knows the subject, I'd like to hear about it.
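      A rough power-budget estimate can hint at whether the 180W charger is the bottleneck. The sketch below uses assumed figures (the ~45W CPU TDP is the published i7-6700HQ spec; the rest-of-system draw and conversion efficiency are guesses, and real transient draws can be much spikier than averages):

```python
# Rough power-budget sanity check with assumed figures -- actual draws vary.
ADAPTER_W = 180      # stock charger rating
CPU_W = 45           # i7-6700HQ TDP (an undervolted chip may draw less)
REST_W = 35          # assumed: screen, drives, RAM, fans, VRM losses
GPU_STOCK_W = 81     # GTX 970M rated TDP
EFFICIENCY = 0.90    # assumed DC conversion efficiency

gpu_budget = ADAPTER_W * EFFICIENCY - CPU_W - REST_W
print(f"GPU power budget: ~{gpu_budget:.0f} W vs {GPU_STOCK_W} W stock TDP")
```

      Under these assumptions the leftover budget barely covers the GPU's stock TDP, so an overclock pushing the card well past it could plausibly brown out on a 180W brick, while a 230W adapter at the same 19.5V would leave real headroom. This is only arithmetic, not a measurement.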
    • By ounces
      After spending significant time and effort to obtain a "DC" screen for the 8770w (which is essentially a regular IPS panel with a fancy board that converts 8bpc LVDS to 10bpc DP), I have finally got one and installed it. All works great, except for one problem...

      It has pretty bad banding / posterization in lower shadows. I have tried profiling it in different modes (full range, sRGB, rec709) - issue persists, and it indeed shows only in the lowest part of the characteristic curve. Mids and highlights are represented fine and show low deviation from reference values.

      The GPU is an HP K4000M; Nvidia drivers installed as-is, and the video card is identified without a hitch.
      Banding was not present with the original TN panel using the same GPU.
       
      While checking a software side, I have noticed that Win10 has bit depth set to 8-bit...
       

       
      My initial reaction was, - "Easy, let's change it in `nvidia-settings` and we're all set":

      ...but that would be too easy, right? After selecting 10bpc and clicking "Apply", the screen went off and back on, only to show that the depth stayed at 8bpc. Repeating the above a few times yielded exactly the same result, and I'm not in a hurry to meet the clichéd (and layman's) definition of insanity.
       
      Let's check GPU-Z. So far so good, nothing unusual. Notice the highlighted BIOS version and subvendor string:
       
      Time to delve into the other tabs. We are running WDDM v2.4, which supports GPU dithering, but hey... the BIOS version has changed!
       
      Briefly back to `nvidia-settings` to check what is reported by vendor's own utility:

       
      So far, we have two strings for BIOS version:
      80.04.5A.00.02 (let's call it "A")
      80.4.33.0.37 (let's call it "B")
      Notice how the 2nd one doesn't seem to follow hexadecimal notation. Lastly, the "NVIDIA BIOS" drop-down reports the "A" version:
       
      ...and monitor section which confirms that rig is indeed capable of 10bpc, but currently running at mere 8bpc:

       
      Windows "Adapter settings", reports version "B". It's 2019, diversity is a must.

       
      "NVidia inspector" is of the same opinion:

       
      Now, let's use some seriously legit tools and check the exported BIOS file in `nvflash`:

       
      Here we have three interesting findings:
      • Reported vendor is Dell, not HP. See this link for details.
      • BIOS version is back to "A". Have I already mentioned diversity?
      • The MXM module uses MX25L2005 flash storage in a WSON-8 package. If things go really nasty, we should be able to rescue the patient via a Pomona clip and an external programmer.
      Loading the same file in "Kepler BIOS tweaker" confirms the facts:

       
      EDID settings, courtesy of NVidia Control Panel. Hex dump can be found at the bottom of this post.
      ...Shall I be worried about "60.02Hz" refresh rate?
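      (For what it's worth, a slightly-off refresh figure like that usually just falls out of the EDID timing arithmetic: the rate is the pixel clock divided by the total horizontal and vertical timings, which rarely lands on a round 60.00. A sketch with hypothetical 1920x1200 timings, since the real values live in the EDID detailed timing descriptor in the dump:)

```python
# Refresh rate = pixel clock / (horizontal total * vertical total).
# All three timing values below are assumed 1920x1200 figures for
# illustration; read the real ones from the EDID detailed timing descriptor.
pixel_clock_hz = 154_450_000  # assumed pixel clock
h_total = 2080                # active width + horizontal blanking
v_total = 1237                # active height + vertical blanking
refresh_hz = pixel_clock_hz / (h_total * v_total)
print(f"{refresh_hz:.2f} Hz")
```

      With plausible timings the result lands just above 60 Hz, so a reported "60.02Hz" is normal rather than a sign of trouble.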
       
      To summarize:
      • Why are two different BIOS versions reported? Anything to do with UEFI (e.g. HP sideloading its own during boot)?
      • Why are two different vendors reported? As far as I remember, this is a branded HP GPU.
      • Where can I get a "clean" K4000M BIOS for future experiments? Ideally from an 8770w equipped with a "DreamColor" panel from the factory.
      Link to the dumps, BIOS ROM and monitor EDID: https://mega.nz/#F!zGgRmQIL!9q2QFZtHuK2RQ-WHXMA4Mg (also attached to this post)
      K4000M.zip
    • By Blocker35
      Hi guys, bit of a rookie to the whole EGPU scene. Currently I have:
       
      - MacBook Pro 2015 13inch (3.1GHz Core i7 16GB RAM)
      - Razer X Core 
      - Apple TB3 to TB2 Adapter
      -  TB2 Cable (Cable Matters)
      - 23inch AOC External Monitor
       
      I'm wondering what graphics card to get to run X-Plane 11 with high graphics settings.
      I have purgewrangler set up and ready to use with an AMD graphics card, but am also open to the idea of an Nvidia graphics card.
      Any advice appreciated. I did not buy a Windows PC as I need a Mac for various other things and wanted an all-in-one laptop.
    • By captcavy
      Hello everyone, I need your expertise. I have an M17 R4 (2012) with an unlocked A15 BIOS thanks to Klem; it has a GTX 675M in it now.
      I would like to know the best GPU to upgrade to without losing a lot of the factory functions of the Alienware laptop, like being able to use both VGA options (such as the Intel HD 4000) and the factory BIOS control of the GPU cooling. I don't want to rely solely on the Nvidia card as the only dedicated GPU.
      Let me know what VGA card to get and what vBIOS to use. I will gladly donate beer for your help, or point me to a member on this forum who can help me.
      Thanks, Shelby, aka captcavy