
Nvidia announces RTX 2060, more powerful than GTX 1070 Ti at $350

Well I'm not defending the price, but I currently own a 670 with 2GB VRAM, so for me that is a big upgrade :) How much VRAM do I need? Is there a big difference between having 6 and 8? I think I can tweak it down to avoid hitching so it'll be fine.
 
The VRAM thing is being blown way out of proportion, just as it was with 4GB vs 6GB, 4GB vs 3GB, 3GB vs 2GB and so on.
Bottom line is this is a fourth or fifth tier card (if you include the Titan); anyone saying it doesn't have enough memory for the job it is aimed at, in my opinion, doesn't understand who this card is aimed at. ;)
 

Maybe for you, but my GTX 980 would run out of memory frequently in all sorts of games at medium details at 1440p with 4GB, hence I replaced it with a 1080 Ti. I have checked memory consumption and I am over 6GB in more than one game, so 8GB should really be the minimum unless you play at 1080p, otherwise your minimum framerates will be awful.
 

If you game at 1440p I would not get a card with 6GB; 8GB should be the minimum.
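
If you want to see what your own games are actually pulling rather than guessing, the easiest thing is to log it while you play (nvidia-smi in a terminal does the same job). A rough sketch, assuming the pynvml Python bindings for Nvidia's management library are installed - and bear in mind what a game allocates isn't always what it strictly needs:

Code:
# Poll VRAM usage on the first GPU every few seconds while a game is running.
# Assumes an NVIDIA driver and the pynvml package (pip install pynvml).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM in use: {used_gb:.2f} GB / {total_gb:.2f} GB")
        time.sleep(5)  # sample every 5 seconds
except KeyboardInterrupt:
    pynvml.nvmlShutdown()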
 
I game at QHD and see games pushing between 6 and 8GB of VRAM already - someone buying a card right now in 2019 is probably going to keep it for a few years, so ideally you want some more leeway with VRAM. Moreover, all the older Nvidia cards and current AMD cards of SIMILAR performance have 8GB of VRAM, and there are quite a number of them, so will devs limit games to 6GB for the next few years?? You are taking a risk IMHO, and if you are spending someone else's money it's not a promise I would be making.

If you are gaming at 1080p then it's more than enough though.

Also, looking at the HardOCP DXR BF V benchmarks, the game uses a ton of VRAM, and it seems to be something I have noticed with DX12 in general - more VRAM usage - so that is going to be another consideration if you want to use raytracing.

Even AnandTech and Eurogamer say it's a solid card overall, but had the following to say too:

AT said:
There are hints that the 6GB framebuffer might be limiting, especially with unexpectedly low 99th percentile framerates at Wolfenstein II in 4K, though nothing to the extent that older 4GB GTX 900 series cards have experienced.

AT said:
6GB is a bit more reasonable progression compared to the 8GB of the RTX 2070 and RTX 2080, but it is something to revisit if there are indeed lower-memory cut-down variants of the RTX 2060 on the way, or if games continue the historical path of always needing more framebuffer space. The biggest question here isn't whether it will impact the card right now, but whether 6GB will still be enough even a year down the line.

Eurogamer said:
There are some further takeaways from the Battlefield 5 RTX experience, and some of them are reminiscent of the arrival of Crysis back in 2008. The importance of VRAM is a consideration here. Just like Crytek's epic back in the day, the arrival of new technology comes with big hikes in system requirements - the evidence of our testing does suggest that overloading framebuffer memory is easily doable at 1080p, and requires some tweaks.

My viewpoint is that if you want to play games at 1440p, get the RTX 2070, especially if you can find one for around £450. It's a bit quicker than my GTX 1080, and at least has extra raytracing performance too compared to the RTX 2060.
 
VRAM is a fairly easy thing to adjust for though; nearly every game I've come across that can use 6GB+ of VRAM has a textures setting that correlates strongly with VRAM usage.
Don't get me wrong, both my current and previous cards have over 6GB of VRAM and I have seen games use it, but that's because I'll have textures on ultra or whatever.
I'd much rather have a powerful card with the potential to bottleneck on VRAM when using inappropriate settings than a card that simply lacks the grunt to push frames, regardless of how much memory it has.
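
To put some rough numbers on why the texture slider is the big VRAM lever, here's a back-of-the-envelope calc. My own assumptions: uncompressed RGBA8 textures with full mip chains, whereas real games use block compression, so treat it as relative scaling rather than exact figures:

Code:
# Rough texture memory: width * height * 4 bytes (RGBA8), plus about a
# third extra for the mip chain. Real games compress textures (BC1/BC7 etc.),
# so absolute numbers are lower, but each step up in texture resolution
# still roughly quadruples the memory needed.
def texture_mib(size_px, bytes_per_pixel=4, mip_overhead=4 / 3):
    return size_px * size_px * bytes_per_pixel * mip_overhead / 1024**2

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size):.0f} MiB per texture")
# 1024x1024: ~5 MiB, 2048x2048: ~21 MiB, 4096x4096: ~85 MiB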
 

The issue is what happens over the next few years. The ultra texture settings of today become the very high or high settings in 12 to 24 months' time, and if you are already turning down settings now on a £330 to £400 card it's not going to get any better. A 2 to 4 year lifespan is not unreasonable for such a card.

We will most likely have a PS5 late this year or early next year, so that will mean another push in graphics quality in games. Something like 1440p is about 1.8 times the number of pixels of 1080p. Cards with RTX 2060-like performance, such as the GTX 1070/GTX 1070 Ti/GTX 1080/Vega 56/Vega 64, have been out for ages and all have an 8GB framebuffer. It makes sense devs would target an 8GB framebuffer for future next-gen titles.
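
Just to put numbers on the resolution point (plain pixel counts per frame - framebuffers and many render targets scale with this, on top of whatever the textures need):

Code:
# Pixels per frame at common resolutions, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
# 1080p: 2,073,600 (1.00x)  1440p: 3,686,400 (1.78x)  4K: 8,294,400 (4.00x)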

After all, we are forgetting one thing here - the 60 series cards are traditionally mainstream cards, aimed at mainstream resolutions, i.e. 1080p. Resolutions like 1440p and 4K are not considered mainstream yet (look at the stats on monitor resolutions), even though pricing has dropped a lot. People are forgetting the RTX 2060 is the most expensive 60 series card in years, and Nvidia probably only considers it a true 1080p card, especially for its future RTX-enabled games. Hence the 6GB framebuffer.

It's priced at an entry-level enthusiast price, but is probably not considered an enthusiast-level card. The RTX 2070, by its very name and the fact it has 8GB of VRAM, is considered that card.

I'd much rather have a powerful card with the potential to bottleneck on VRAM when using inappropriate settings than a card that simply lacks the grunt to push frames, regardless of how much memory it has.

You mean like the 8800GT 256MB, which ended up failing so badly within 12 to 24 months that the slower 9600GT 512MB ended up being a better card??

People made those arguments back then, and sure, it was an extreme example, but there have been a few instances, even going back 16 years, where I can remember cards being really limited by VRAM. For instance, some of the special edition ATI 9800 series cards in prebuilt PCs shipped with only half the VRAM of the retail versions.
 
I just don't know why you'd buy a 6GB card for £350; it makes no sense, even at 1080p. Who knows what the near future will bring? Why spend £350 just to lower settings and textures? It's insane. At 1440p, buying a 6GB card is complete lunacy. You can buy an 8GB card for £200, so why would a much faster card be saddled with less VRAM?

The 2060 is such a dumb card, it's a hardware mismatch; buy a 2070 or buy Vega, but don't buy a 2060. Even if 6GB is enough today - and that's highly debatable - it's not going to be enough for very long, unless you enjoy playing at lower settings, in which case you may just as well have bought a card for a lot less money to begin with.
 
That's the thing: if people don't want to buy a £350 card with only 6GB of VRAM, then don't. Either spend more for an Nvidia 2070 or upwards, or go for AMD, currently at very reasonable prices.

The 6GB of VRAM isn't really the problem; it's the price that is wrong, it is too expensive for that tier of card. All the other RTX cards are available for a good deal less than the FE price, but not the 2060 - the FE is the cheapest by quite a bit.
 
Feels pretty bad that I bought my GTX 1070 two years ago for less than I can buy it for now, even taking the EOL sales into account.

This is exactly why I have not bought a new card in such a long time; it just felt like we were being ripped off due to crypto inflation and Nvidia trying to cash in as much as possible. (I still think Nvidia's cards look a hell of a lot nicer for the reference models.)
 
It'll be interesting to see the sales figures for the RTX 2060 and how many Nvidia have produced going by their predictions for demand. It's got the mainstream **60 tag, but its target segment is pretty much a niche: someone who wants ray tracing at strictly 1080p, has a £300+ budget for a graphics card, and yet would never consider upgrading to a 1440p+ monitor.

It's probably a handy card for including in pre-built systems - companies get to market it with the tag of an "RTX 2xxx PHWOAR series" for the lemmings who might otherwise turn up their nose at a 1080/1080 Ti because it's not 'latest' enough.

An 8GB RTX 2060 at that price point would be a reasonable game changer, but then it'd undermine not just Vega 56/64 and the GTX 1070/1070 Ti/1080, but also the RTX 2070.

Makes me wonder if Nvidia's strategy is to keep on with the 1060 for mainstream 1080p gaming, with the price point and lack of VRAM on the 2060 essentially being used to push anyone intent on upgrading into considering the 2070, and therefore onto the steeply rising price curve of the 2070-2080-2080 Ti. If they're only making 2060s in relatively limited quantities, that would suggest to me that might be the case.
 
The Bottom Line


We are going to say this up front and get this out of the way, a $349.99-$389.99 video card has no excuse for only having 6GB of VRAM capacity in 2019. At this pricing and cost you should expect no less than 8GB of VRAM.
The GeForce GTX 1070 had it, the GTX 1070 Ti had it, the AMD Radeon Vega 56 had it, all those video cards are close to this price range. If the RTX 2060 was around $250 like the GTX 1060, it would make more sense, but at $350 no way. This limitation does bottleneck NVIDIA Ray Tracing performance in Battlefield V.

The GeForce RTX 2060 does lack the NVIDIA Ray Tracing performance of its bigger brothers. There is a large difference in performance between it and the RTX 2070. Out of all the RTX cards, the RTX 2060 presented the most negative impact on Battlefield V performance when enabling NVIDIA Ray Tracing. It is a joke to call the RTX 2060 an NVIDIA Ray Tracing card, and then use Battlefield V to show that off. Some would say RTX should have only been enabled for the RTX 2080 and 2080 Ti currently. We can certainly see how one could conclude that based on RTX 2070 and RTX 2060 NVIDIA Ray Tracing performance in Battlefield V. If you can’t use NVIDIA Ray Tracing due to performance in the first place, then what’s the point?

If the "RTX is a lie," like we see it to be overall, then the RTX 2060 is an outright dishonest fabrication when it comes to ray tracing and Battlefield V. It just doesn’t work, and Jensen should be ashamed for telling us all that it does.



Ouch, the 2060 is a very poorly realised GPU.
 