Nvidia announces RTX 2060, more powerful than GTX 1070 Ti at $350

Man of Honour · Joined: 25 Oct 2002 · Posts: 31,735 · Location: Hampshire
The issue is what happens over the next few years. The Ultra texture settings of today become the Very High or High of 12 to 24 months' time, and if you are already turning down settings now on a £330 to £400 card it's not going to get any better. A 2 to 4 year lifespan is not unreasonable for such a card.

Perhaps, but £330 is the new normal, i.e. well under half the cost of flagship cards, and expecting such a card to last 4 years is quite optimistic. Turning down settings really isn't a problem; people have had to turn down settings on flagship cards before, never mind mid-range cards, and a setting is just a label. I have a 1070ti, which is roughly comparable to a 2060 but with more VRAM, and I still turn down settings. VRAM isn't the limitation; it is more about having the raw power to deliver high framerates.

You mean like the 8800GT 256MB, which ended up falling off so badly within 12 to 24 months that the slower 9600GT 512MB ended up being the better card?

People made those arguments back then, and sure, it was an extreme example, but there have been a few instances I can remember, even going back 16 years, where cards were really limited by VRAM. For instance, some of the special edition ATI 9800 series cards in prebuilt PCs shipped with only half the VRAM of the retail versions.

6GB vs 8GB is less extreme though; it isn't as if it has shipped with 4GB of VRAM. Arguably, with modern monitors at 144Hz+ the balance shifts even more towards GPU/CPU grunt as opposed to VRAM, although I will concede that if people have 4K monitors it perhaps changes the equation somewhat.

I guess we are getting sidetracked into the old VRAM debate that has been done several times over the years, but if you are talking about 8800GT vs 9600GT you need to bear in mind the 8800GT came out in 2006 and the 9600GT in 2008. So it's perhaps not all that surprising that it got superseded by something all that time later, and in any case if you bought an 8800GT, after two years you'd probably be considering an upgrade anyway because of technology moving on. In summer 2008, just a few months after the 9600GT came out, you could pick up a GTX280 for well under £250; that was a card that totally annihilated the 8800GT/9600GT. I got one to replace an 8800GTS (I assume the GTX260 was even cheaper). So let's say you bought an 8800GT-256 and after 18 months or whatever you were sat there going 'gee this 256MB VRAM really sux'; probably you were due an upgrade in any case, with decent options on the market. If someone wanted a card to last for many years then perhaps it wasn't a great choice, but in the noughties the idea of keeping a card for 4 years would have drawn derision in these parts.
 
Associate · Joined: 6 Mar 2008 · Posts: 1,922
Progress has slowed anyway. I mean, the upper mid-range AMD cards (290, 390, 480, 580, 590) are not that different in performance; power usage is definitely better on newer cards, but there's no big jump in fps.
 

bru

Soldato · Joined: 21 Oct 2002 · Posts: 7,360 · Location: Kent
Don't forget that it only has 6GB of RAM mainly due to the 192-bit bus, so no silly memory management needed.
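
For what it's worth, the 6GB follows directly from that bus width. A rough sketch of the arithmetic, assuming the usual layout of one 1GB (8Gb) GDDR6 chip per 32-bit channel, which appears to be how the launch boards are configured:

```python
# Why a 192-bit bus lands on 6GB: GDDR6 chips hang off 32-bit channels,
# and 1GB (8Gb) chips were the common density when the card launched.
bus_width_bits = 192
bits_per_channel = 32
chip_capacity_gb = 1            # assumption: 8Gb (1GB) GDDR6 chips

channels = bus_width_bits // bits_per_channel   # 192 / 32 = 6 channels
vram_gb = channels * chip_capacity_gb           # 6 x 1GB = 6GB
print(f"{channels} channels x {chip_capacity_gb}GB chips = {vram_gb}GB")

# Getting to 8GB would need a 256-bit bus; getting to 12GB on the same bus
# would need 2GB chips or two chips per channel (clamshell mode).
```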
 
Associate · Joined: 17 Sep 2018 · Posts: 1,431
The 2060, 2070 and 1070ti comparisons between 1080p, 1440p and 4K are really interesting here:


The 2060 is mostly ahead of or on par with the 1070ti at 1080p and 1440p, but in some games like AC Odyssey its performance lags as the resolution goes up, even just to 1440p, and in plenty of games at 4K. Interestingly, in AC Odyssey it's way worse than the 1070ti at 1440p, but at 4K it's not so far behind. So with PS5 textures around the corner I'd imagine you'd be turning settings down at 1440p and below.

Great value for the here and now though, faster than a Vega 56 for pretty much the same price, albeit with 2GB less VRAM.

Navi needs to beat/match this card with 8GB at £300. Then Nvidia might bring out an 1160
 
Soldato · Joined: 9 Mar 2003 · Posts: 14,214
Does a 2060 even get 60fps in the latest games at 4K and 'ultra' settings?

If not, then this conversation is already irrelevant and it's the same old argument as with the 970 and its VRAM 'issues'.

More often than not these cards run out of puff before VRAM at 4K.

Didn't Linus do a video about 4K late last year and how pointless it is if you can't max the details and get decent frames? He concluded lower detail at 4K is worse than native 1440 or 1080 at higher detail.
 
Soldato · Joined: 18 Feb 2015 · Posts: 6,484
Does a 2060 even get 60fps in the latest games at 4K and 'ultra' settings?

If not, then this conversation is already irrelevant and it's the same old argument as with the 970 and its VRAM 'issues'.

More often than not these cards run out of puff before VRAM at 4K.

Didn't Linus do a video about 4K late last year and how pointless it is if you can't max the details and get decent frames? He concluded lower detail at 4K is worse than native 1440 or 1080 at higher detail.

Thankfully, we have working eyes and brains and can decide for ourselves. I can assure you not even 1800p looks anywhere near as good as native 4K, especially in games relying on TAA, which are the majority these days. Most settings you can max or set to "very high" with minimal performance impact, and the most impactful of them is texture quality, which depends entirely on VRAM.

The 2060 would actually have been a decent budget 4K card if it had 8GB, just like the 1070ti/1080 before it. I bet it would have been equivalent to whatever the baseline PS5 performance tier is, similar to how the Polaris cards were enough for console parity all throughout their life.

And that's without even taking into account HD textures or modding.
 
Associate · Joined: 17 Sep 2018 · Posts: 1,431
Does a 2060 even get 60fps in the latest games at 4K and 'ultra' settings?

If not, then this conversation is already irrelevant and it's the same old argument as with the 970 and its VRAM 'issues'.

More often than not these cards run out of puff before VRAM at 4K.

Didn't Linus do a video about 4K late last year and how pointless it is if you can't max the details and get decent frames? He concluded lower detail at 4K is worse than native 1440 or 1080 at higher detail.

It's not just about 4K. Game textures will become increasingly detailed and as that happens they will draw more VRAM. The 2060 is pretty much fine for most games at 1440p now, but will it be at PS5 launch? Actually, it seems to suffer in AC Odyssey at 1440p right now.
 
Soldato · Joined: 5 Feb 2012 · Posts: 2,640
Be careful with these cards. Looks like I'm sending mine back and it's only 10 days old. Stuttering in every single game; pretty sure the GDDR6 is failing on it or something.
 
Soldato · Joined: 9 Nov 2009 · Posts: 24,825 · Location: Planet Earth
Perhaps, but £330 is the new normal, i.e. well under half the cost of flagship cards, and expecting such a card to last 4 years is quite optimistic. Turning down settings really isn't a problem; people have had to turn down settings on flagship cards before, never mind mid-range cards, and a setting is just a label. I have a 1070ti, which is roughly comparable to a 2060 but with more VRAM, and I still turn down settings. VRAM isn't the limitation; it is more about having the raw power to deliver high framerates.

6GB vs 8GB is less extreme though; it isn't as if it has shipped with 4GB of VRAM. Arguably, with modern monitors at 144Hz+ the balance shifts even more towards GPU/CPU grunt as opposed to VRAM, although I will concede that if people have 4K monitors it perhaps changes the equation somewhat.

It really isn't - it's only the new normal if people excuse it. It's a stupid argument that since a flagship model costs XYZ, hardware enthusiasts think mainstream models need to reflect it. Using that absurd logic, if the new Ford GT40 costs £100s of 1000s, the price of a Focus needs to be £50,000.

Also, I have a GTX1080 and games breach 6GB already. DX12 games appear to use more VRAM in my experience, and DXR is based on that. Expect more DX12 games if Nvidia is pushing RT.

Do people think Nvidia or AMD are giving you something for nothing nowadays? Nvidia knows very well the RTX2060 will be limited by its VRAM sooner than an RTX2070. So as time progresses the RTX2070 will start to push ahead at 1440p and 4K, and at 1080p and 1440p when DXR is used.


I guess we are getting sidetracked into the old VRAM debate that has been done several times over the years, but if you are talking about 8800GT vs 9600GT you need to bear in mind the 8800GT came out in 2006 and the 9600GT in 2008. So it's perhaps not all that surprising that it got superseded by something all that time later, and in any case if you bought an 8800GT, after two years you'd probably be considering an upgrade anyway because of technology moving on. In summer 2008, just a few months after the 9600GT came out, you could pick up a GTX280 for well under £250; that was a card that totally annihilated the 8800GT/9600GT. I got one to replace an 8800GTS (I assume the GTX260 was even cheaper). So let's say you bought an 8800GT-256 and after 18 months or whatever you were sat there going 'gee this 256MB VRAM really sux'; probably you were due an upgrade in any case, with decent options on the market. If someone wanted a card to last for many years then perhaps it wasn't a great choice, but in the noughties the idea of keeping a card for 4 years would have drawn derision in these parts.

The 8800GT 256MB was late 2007 and the 9600GT was early 2008:

https://www.techpowerup.com/gpu-specs/geforce-9600-gt.c206
https://www.techpowerup.com/gpu-specs/geforce-8800-gt.c201

They are exactly the same uarch (G92 and G94). Some people deliberately argued for people to buy the 8800GT 256MB, and in the end performance collapsed and there were people on tech forums who got caught out.

The people who gave the advice went all silent and funnily enough none of them actually owned a 256MB card during that time. How convenient.

Mates who had the 8800GT 512MB had a card which was fine for years.

The fact is people don't seem to realise how long normal gamers keep cards - 2 to 4 years is how long I see most gamers keep 60 series cards or the AMD equivalent in the real world, and an example was the chap here who said they had a GTX760. Not enthusiasts who change hardware quicker than their underwear.

Moreover, even AnandTech and Eurogamer thought it was a solid enough card but had concerns.

AT said:
There are hints that the 6GB framebuffer might be limiting, especially with unexpectedly low 99th percentile framerates at Wolfenstein II in 4K, though nothing to the extent that older 4GB GTX 900 series cards have experienced.

AT said:
6GB is a bit more reasonable progression compared to the 8GB of the RTX 2070 and RTX 2080, but it is something to revisit if there are indeed lower-memory cut-down variants of the RTX 2060 on the way, or if games continue the historical path of always needing more framebuffer space. The biggest question here isn't whether it will impact the card right now, but whether 6GB will still be enough even a year down the line.

Eurogamer said:
There are some further takeaways from the Battlefield 5 RTX experience, and some of them are reminiscent of the arrival of Crysis back in 2008. The importance of VRAM is a consideration here. Just like Crytek's epic back in the day, the arrival of new technology comes with big hikes in system requirements - the evidence of our testing does suggest that overloading framebuffer memory is easily doable at 1080p, and requires some tweaks.

My viewpoint is if you want to play games at 1440p, get the RTX2070, especially if you can find one for around £450. It's a bit quicker than my GTX1080, and at least has extra raytracing performance compared to the RTX2060.

I suspect the next card you get to replace the GTX1070TI is going to have 8GB of VRAM or even more than that.

Don't make promises about 6GB being enough for the next few years at 1440p or even higher resolutions. Monitors with QHD and 4K resolutions are dropping under £200 now.

I am not going to agree with you, methinks, and neither are you with me; hence, I will respectfully agree to disagree with you.
 
Soldato · Joined: 9 Nov 2009 · Posts: 24,825 · Location: Planet Earth
Don't forget that it only has 6GB of RAM mainly due to the 192-bit bus, so no silly memory management needed.

I don't see an issue with adding more VRAM on a 192-bit bus - the GTX660 and GTX660TI technically should have been 1.5GB cards but used an extra 512MB of more slowly addressed RAM, which extended their lifespan. I had one of those cards BTW.

The GTX970 was controversial since Nvidia didn't tell people it was doing it, but 2GB of additional slower VRAM on an RTX2060 would still be better than it using system RAM.

Remember, it's GDDR6, so it would be significantly faster than system RAM. Even if the last 2GB ran on a 64-bit bus, it would still be close to 100GB/s.
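
A back-of-envelope check on that figure, assuming the 2060's 14Gbps GDDR6 and a purely hypothetical 64-bit slice serving the extra 2GB, compared with typical dual-channel DDR4-3200 system RAM:

```python
# Bandwidth of a hypothetical 64-bit GDDR6 slice vs dual-channel DDR4 system RAM.
gddr6_rate_gbps = 14                      # per-pin data rate of the 2060's GDDR6
slice_width_bits = 64                     # hypothetical bus slice for the extra 2GB
slice_bw = gddr6_rate_gbps * slice_width_bits / 8
print(f"64-bit GDDR6 slice: {slice_bw:.0f} GB/s")        # ~112 GB/s

ddr4_rate_mts = 3200                      # DDR4-3200, dual channel (2 x 64-bit)
ddr4_bw = ddr4_rate_mts * 128 / 8 / 1000
print(f"Dual-channel DDR4-3200: {ddr4_bw:.1f} GB/s")      # ~51 GB/s
```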

Navi needs to beat/match this card with 8GB at £300. Then Nvidia might bring out an 1160

The GTX1160/GTX1660TI is out next month apparently - it looks to be a Turing GPU with no RTX hardware and 20% more shaders than a GTX1060 6GB. So probably 20% faster?

I also feel that if people normalise 60 series midrange cards at close to £350, AMD will price the top midrange Navi above £300, especially if it ends up faster and has more VRAM than an RTX2060.

At this rate the midrange RTX4060 will be a £400 card and so will the AMD and Intel equivalents.

Let's hope AMD and Intel do a Ryzen strategy instead. Yeah, you heard that right, we really need Intel too, as we need them to have something competitive in 2020. That way none of the companies can get complacent and start jacking up prices. However, again, if people justify price increases it will end up like what the RAM cartels have done, keeping pricing as high as they can.
 
Man of Honour · Joined: 25 Oct 2002 · Posts: 31,735 · Location: Hampshire
It really isn't - it's only the new normal if people excuse it. It's a stupid argument that since a flagship model costs XYZ, hardware enthusiasts think mainstream models need to reflect it. Using that absurd logic, if the new Ford GT40 costs £100s of 1000s, the price of a Focus needs to be £50,000.

The Focus needs to be £50,000 if it's also £50k for an Astra, £50k for a Golf, etc., and so on down the chain... but really the car market is way more competitive, with many more options. The point being, if you want something better than an RTX2060, it's not like you can spend massively less. You can get better-value cards for less, you can get a roughly comparable Vega card for similar money, or you can spend more and get something better.

Also, I have a GTX1080 and games breach 6GB already
Yes they do, but as I said, reducing VRAM usage hasn't been that difficult to date. Lower textures, lower AA, and I've never had a problem keeping it in check.
DX12 games appear to use more VRAM in my experience, and DXR is based on that. Expect more DX12 games if Nvidia is pushing RT.
Valid point and one I hadn't considered.

Nvidia knows very well the RTX2060 will be limited by its VRAM sooner than an RTX2070. So as time progresses the RTX2070 will start to push ahead at 1440p and 4K, and at 1080p and 1440p when DXR is used.
The RTX2070 costs about 40% more than the RTX2060, so I would be worried if it didn't ultimately turn out to be a better card. The original point I was making was around the need to have GPU power first and foremost, and then worry about maybe coming up short on VRAM in the future. So if I compare to a weaker card with more VRAM, perhaps an RX590, then for me I'd much rather have the RTX2060. Perhaps a fairer comparison would be Vega 56: similar money, the question being whether you trade a bit more grunt for a bit more VRAM. It's of course conceivable that Vega 56 might be a better long-term buy, but I tend to favour a bird in the hand.

The 8800GT 256MB was late 2007 and the 9600GT was early 2008:
Mates who had the 8800GT 512MB had a card which was fine for years.
My bad on the dates, I got mixed up with the original 8 series launch. Based on that, the 8800GT-256 does seem a more questionable buy, but I think 256MB in late 2007 isn't really comparable to 6GB at the start of 2019. 256MB was a third of what a decent card from a year earlier like the 8800GTX had, whereas 6GB is over half what you get in a brand new 2080ti.

The fact is people don't seem to realise how long normal gamers keep cards - 2 to 4 years is how long I see most gamers keep 60 series cards or the AMD equivalent in the real world, and an example was the chap here who said they had a GTX760. Not enthusiasts who change hardware quicker than their underwear.
But are these 'normal gamers' the people who are really going to suffer on VRAM? Are they going to be packing 4K monitors and demanding all settings on max? Even when I kept a card from 2012 until 2017, and it wasn't a flagship card, I don't recall hitting a VRAM bottleneck.

My viewpoint is if you want to play games at 1440p, get the RTX2070
Returning to the Ford analogy, isn't that just paying £70k for a Mondeo though? £450 is a lot for 'normal gamers' to stump up; even as a semi-enthusiast I've never spent anywhere near that on a graphics card. I'm not saying the 2070 is a bad buy, but equally I don't think its presence renders the 2060 a bad buy either; they are sufficiently differentiated in price.

I suspect the next card you get to replace the GTX1070TI is going to have 8GB of VRAM or even more than that.
Probably, but then in over 20 years of buying 3D cards I've never bought one with less VRAM than the card it replaced, not because I've been wanting more VRAM, but because the amount cards ship with just naturally expands over time, as you'd expect. Additionally, I only got the 1070ti a few months ago, the most expensive card I've ever had, so I doubt I would be looking to replace it much before 2021 unless something significantly more powerful and cheap became available.

Monitors with QHD and 4K resolutions are dropping under £200 now.

I am not going to agree with you, methinks, and neither are you with me; hence, I will respectfully agree to disagree with you.

I don't disagree with everything you have said :) Look, if 4K monitors get very cheap, and DX12 takes off and means more VRAM usage, and the installed footprint of high-VRAM cards rockets, and consoles have more VRAM, etc., then yes, it could transpire that in 2022 the RTX2060 doesn't look that great. But my hope is that most people bothered by that would be in a position to upgrade.
 
Soldato · Joined: 18 Feb 2015 · Posts: 6,484
It's of course conceivable that Vega 56 might be a better long-term buy, but I tend to favour a bird in the hand.

It already is.
https://www.gamersnexus.net/hwrevie...-founders-edition-review-benchmark-vs-vega-56
[GamersNexus chart: F1 2018 at 4K, RTX 2060 vs Vega 56 and other cards]


This "it's easy to reduce vram usage" logic is anything but. Actually idiotic, because think about it, what's the difference between saying that and saying "it's easy enough to get as much fps as a 2060 with X/Y/Z card, just reduce settings/resolution". Not having enough vram leads to the worst downgrades possible, due to worse textures, poorer texture streaming, loading hiccups, stuttering (notice those beautiful 1% & 0.1% frametimes), etc. To say nothing of the fact that we're still in a FHD textures games context where we're not really seeing the full higher res experience (only some games grace us with such options, either directly or through mods).

So, if you don't care about visuals, sure, go for less VRAM. If you do, you'd be shooting yourself in the foot with 6GB.
 
Associate · Joined: 23 Aug 2005 · Posts: 1,273
I've gone for a 1070 from an e-auction in the end! I'll look at getting a new 1440 IPS monitor later this year and see how the 1070 goes on that. I may get the next 2170... whenever that will be!

I still think the 2060 is a good card; it depends what you want and what budget you have. I'm sure raytracing is great once it's been tweaked and is available in more games. VRAM is a grey area for sure. Games should have better optimisation for each tier of VRAM, 4-6-8GB, and use all of it please (without hitching!).
 
Man of Honour · Joined: 25 Oct 2002 · Posts: 31,735 · Location: Hampshire
I don't consider it idiotic at all, as it is based on experience. It IS relatively easy, from what I've found, to reduce VRAM usage, whereas lifting framerate at a given resolution can be trickier in some games. With VRAM you nearly always get a textures setting that has a direct correlation to it. Heck, several games even come with a VRAM calculator thingy, or some guidance notes on the effect of changing it.

Reducing resolution is much more problematic due to native resolutions. Dropping textures a notch is way less impactful in most cases than dropping resolution.

I don't dispute that not having enough VRAM is bad, but I consider it largely avoidable if you have a 6GB card.

As for that graph, it's 4K ultra settings. I don't have a 4K monitor, so I wouldn't be using that resolution.
Furthermore, are the 0.1% low framerates really just to do with VRAM? It doesn't look like it, because other Nvidia cards with more VRAM are scoring similarly; the 2070/1080 are below 15fps as well. So actually what we are seeing is that the 2060 is holding up very well vs the 2070 with more VRAM, even with settings that should favour the card with more RAM.
 
Soldato · Joined: 14 Nov 2007 · Posts: 16,146 · Location: In the Land of Grey and Pink
I've gone for a 1070 from an e-auction in the end! I'll look at getting a new 1440 IPS monitor later this year and see how the 1070 goes on that. I may get the next 2170... whenever that will be!

I still think the 2060 is a good card; it depends what you want and what budget you have. I'm sure raytracing is great once it's been tweaked and is available in more games. VRAM is a grey area for sure. Games should have better optimisation for each tier of VRAM, 4-6-8GB, and use all of it please (without hitching!).

I run a 1070 on a G-Sync monitor and it performs really well.

Especially when the Vulkan API is used. Why isn't this the standard over DX11/12?
 
Soldato · Joined: 18 Feb 2015 · Posts: 6,484
Enough VRAM vs not enough VRAM - or if you ever want to use HD textures. Look especially at 1:24!


You can't negotiate your way out of a lack of VRAM! 50% faster for the 6GB card. That's more than a generational leap. Hell, today that's like two.

It just makes me sad that they couldn't add another 2GB. I play at 4K with an RX480. This 2060 is twice as fast, or thereabouts, so why shouldn't it play at 4K? It was well enough priced (globally) too. If the Pascal cards weren't gimped in 4K & HDR I wouldn't even give it another thought, but alas, it's always a step back somewhere with Nvidia.
 
Man of Honour · Joined: 25 Oct 2002 · Posts: 31,735 · Location: Hampshire
AC Odyssey only came out a couple of months ago, and it's struggling on an old card that came out more than two years prior with pretty low VRAM for the time - this isn't that surprising to me. I deliberately avoided the 1060-3GB card because I considered that too low.

So anyway let's look at the VRAM usage for it: https://www.overclock3d.net/reviews/software/assassin_s_creed_odyssey_pc_performance_review/13
So at 1440p, even at ultra high settings, it uses under 6GB.
At 4K, things get more interesting, but as I said before, it's very easy to solve the VRAM utilisation: drop to High settings and *BOOM*, you're using under 5GB of VRAM. If you look at the performance numbers for 4K, you aren't going to want to use higher than High settings anyway, because even the 1080 is hitting only 35fps. This comes back to my point that you just need enough power first and foremost; it's all well and good saying yeah yeah, you run out of VRAM when you max it out, but fundamentally some of the games are just too slow at those settings irrespective of VRAM. It's that age-old problem that dates back to, I dunno, the days of the ti4200 64MB vs 128MB, whereby in order to manufacture a scenario where the higher VRAM card is a clear winner, you've had to render both cards useless by maxing out the settings so much that the framerate is low on both. Yippee, I get 25fps instead of the 20fps I'd get with less VRAM, shame it's unplayable anyway!

I've yet to see a game that performs brilliantly on a card with very high VRAM and rubbish on a more powerful card with medium-high VRAM (6GB); maybe they exist, but they must be pretty niche/rare and surely easy to fix with a slight tweak to settings.
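
If anyone wants to sanity-check their own headroom rather than rely on review tables, here's a minimal sketch that polls VRAM usage while a game runs; it assumes an Nvidia card and the third-party pynvml bindings (plain nvidia-smi reports the same numbers):

```python
# Minimal VRAM usage poller - run it in the background while playing.
# Assumes an Nvidia GPU and the pynvml bindings (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)        # first GPU in the system
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GB")
        time.sleep(5)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Bear in mind games often allocate more than they strictly need, so a near-full readout on its own isn't proof of a bottleneck.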
 