The 16: GTX 1660 Ti, GTX 1660 and GTX 1650

Associate
Joined
30 Jan 2003
Posts
1,514
Location
Cardiff
The GTX 1660 Ti, featuring 1536 CUDA cores, is expected to launch on February 15th for around 279 USD. A cheaper variant called the GTX 1660 (non-Ti), featuring GDDR5 memory instead of GDDR6 and fewer cores (1280), will launch in early March. This SKU is expected to cost 229 USD.

However, this is not where the GTX 16 series ends. According to HardOCP, NVIDIA will launch the GTX 1650 for 179 USD later, probably in late March.

This would ultimately confirm that GTX Turing will stick to the GTX 16XX naming scheme. An unexpected choice, for sure.


https://videocardz.com/79847/hardocp-nvidia-geforce-gtx-1660-ti-to-cost-279-usd
 
Associate
Joined
15 Oct 2018
Posts
1,293
6 GB across the whole range :rolleyes:

The biggest problem with the RTX 2060 is the 6 GB. Nvidia really seems to want people to go RTX 2070+ to get 8 GB, but the Vega 56/64 and GTX 1070 Ti/1080 are already there. It's like they've tricked themselves into thinking they're releasing their cards in a vacuum, where nothing before the RTX 2xxx series or anything from AMD exists.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
6 GB across the whole range :rolleyes:

The biggest problem with the RTX 2060 is the 6 GB. Nvidia really seems to want people to go RTX 2070+ to get 8 GB, but the Vega 56/64 and GTX 1070 Ti/1080 are already there. It's like they've tricked themselves into thinking they're releasing their cards in a vacuum, where nothing before the RTX 2xxx series or anything from AMD exists.

You forgot the RX 590/580 and also the £159 RX 570! So everything over £158 in AMD's lineup has 8GB.

At least Nvidia doesn't sell any 3GB versions... but I might be wrong here :D
 
Associate
Joined
15 Oct 2018
Posts
1,293
You forgot the RX 590/580 and also the £159 RX 570! So everything over £158 in AMD's lineup has 8GB.

At least Nvidia doesn't sell any 3GB versions... but I might be wrong here :D

You may rest assured, my friend, there will always be a 3GB version from Nvidia, with one variant of the mighty GTX 1660 sporting just that :D With that in mind, I wonder just how low the 1650 can go. Exciting times :D
 

LiE

Caporegime
Joined
2 Aug 2005
Posts
25,645
Location
Milton Keynes
6 GB across the whole range :rolleyes:

The biggest problem with the RTX 2060 is the 6 GB. Nvidia really seems to want people to go RTX 2070+ to get 8 GB, but the Vega 56/64 and GTX 1070 Ti/1080 are already there. It's like they've tricked themselves into thinking they're releasing their cards in a vacuum, where nothing before the RTX 2xxx series or anything from AMD exists.

 
Associate
Joined
15 Oct 2018
Posts
1,293

And elsewhere the GTX 1070 8GB thrashes the more powerful RTX 2060 6GB at 4K.

I've also straddled this memory line with a GTX 970 at 3440x1440, and was amazed at the efficiency of the drivers in being able to run at 3.5-4 GB. Then you push it one notch over with anti-aliasing/textures and the performance collapses, like the 2060 did at 4K vs the GTX 1070 in Wolfenstein something or other (not familiar with those games).

If Nvidia's big shout this generation is 'more efficient use of memory! less is moar!' then fine, but it's not. The 6 GB of VRAM in the 2060 series and lower is blatantly about railroading potential buyers into the 2070+ if they want the reassurance of 8GB of memory. Except they're arrogant enough to pretend other 8GB cards aren't available/already commonplace.

Maybe it's all psychological and VRAM is utterly irrelevant, but it'll take Nvidia a while to program us with that idea, as we're addicted to decent VRAM.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
And elsewhere the GTX 1070 8GB thrashes the more powerful RTX 2060 6GB at 4K.

I've also straddled this memory line with a GTX 970 at 3440x1440, and was amazed at the efficiency of the drivers in being able to run at 3.5-4 GB. Then you push it one notch over with anti-aliasing/textures and the performance collapses, like the 2060 did at 4K vs the GTX 1070 in Wolfenstein something or other (not familiar with those games).

If Nvidia's big shout this generation is 'more efficient use of memory! less is moar!' then fine, but it's not. The 6 GB of VRAM in the 2060 series and lower is blatantly about railroading potential buyers into the 2070+ if they want the reassurance of 8GB of memory. Except they're arrogant enough to pretend other 8GB cards aren't available/already commonplace.

Maybe it's all psychological and VRAM is utterly irrelevant, but it'll take Nvidia a while to program us with that idea, as we're addicted to decent VRAM.


It has nothing to do with that. Simply, Nvidia's GPUs don't need the higher bandwidth that necessitates more VRAM chips; this lowers prices and reduces power. AMD don't make 8GB cards out of the goodness of their hearts; they just need the 256-bit memory interface, especially since they have yet to adopt GDDR6.
 
Associate
Joined
15 Oct 2018
Posts
1,293
It has nothing to do with that. Simply, Nvidia's GPUs don't need the higher bandwidth that necessitates more VRAM chips; this lowers prices and reduces power. AMD don't make 8GB cards out of the goodness of their hearts; they just need the 256-bit memory interface, especially since they have yet to adopt GDDR6.

I'll be receptive to you providing a source explaining that. Otherwise, WTF? Are you saying VRAM only works because it's daisy-chained like dual-channel/quad-channel system RAM (but more so), where memory bus width is everything and capacity is an afterthought? Well outside my know-how, if so.

Also, if AMD want their maximum memory bus width and require x number of chips to get it, then they could presumably do it on 4GB/6GB cards with smaller-capacity memory chips, in this exciting new era where Nvidia offer only 6GB cards under the £500 price tag. Shame on AMD for doing 8GB at £150 with the RX 570, up to Vega 64s at £400+ with 8GB of wasteful memory, when 6GB will apparently do.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
VRAM usage results are almost always misleading, simply because all the variables aren't isolated, so people end up comparing apples and mangoes. Very few games impose strict VRAM cut-offs, so you're eating an apple thinking it's a mango and marvelling that it's the same price. They're not, and it's not. They really also needed to add an AMD card to the comparison and then actually examine the visual output. Instead, a lot of corners get cut and people simply don't pay attention, just like how DLSS/reconstruction techniques can pass for 4K for many.

 
Caporegime
Joined
18 Oct 2002
Posts
32,618
I'll be receptive to you providing a source explaining that. Otherwise, WTF? Are you saying VRAM only works because it's daisy-chained like dual-channel/quad-channel system RAM (but more so), where memory bus width is everything and capacity is an afterthought? Well outside my know-how, if so.

Also, if AMD want their maximum memory bus width and require x number of chips to get it, then they could presumably do it on 4GB/6GB cards with smaller-capacity memory chips, in this exciting new era where Nvidia offer only 6GB cards under the £500 price tag. Shame on AMD for doing 8GB at £150 with the RX 570, up to Vega 64s at £400+ with 8GB of wasteful memory, when 6GB will apparently do.


It is the very basics of how memory works. GDDR chips have a 32-bit bus width, so each VRAM chip occupies a 32-bit slice of the memory interface. If your GPU needs the bandwidth afforded by a 256-bit bus, then you need 8 VRAM chips. If you only need 192-bit, then you only need 6 VRAM chips. Chips come in different densities; at the moment 1GB is fairly standard and affordable, and I'm not sure what other sizes are available, if any. Thus the 2060 ends up with 6x1GB, and the RX 580 with 8x1GB.

For Nvidia to increase the VRAM on the 2060, they could have some GDDR6 chips share a single 32-bit bus, but those chips would then have to share memory bandwidth. Nvidia have done this in the past with lower-end cards. However, just look how much fuss Nvidia got into with the 970 and its memory configuration, when some 0.5GB of its memory was slower. That huge outcry over nothing has pretty much destined the 2060 not to have 8GB of VRAM with such an arrangement.

The alternative is to create a 256-bit memory interface, which adds to the die size and production costs. This is basically what the 2070/2080 is.

In theory, if GDDR6 chips come in different densities, then the VRAM capacity can also change. This might happen with the lower-end 1650 cards with a 3GB option. If a double-density chip were available, you could have a 2060 with 12GB of memory at a huge cost, but without any more bandwidth, so the extra memory would be largely useless.

I also expect the existence of GDDR6 has actually made the case for 8GB weaker, because a 192-bit bus provides plenty of bandwidth, especially with Nvidia's efficient design. Next generation will probably see the 3060 move to a 256-bit interface and 8GB of VRAM.
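The bus-width arithmetic above can be sketched in a few lines (my own illustration, not anything official): each GDDR chip takes a 32-bit slice of the bus, so the bus width fixes the chip count, and the per-chip density then fixes the total capacity.

```python
def vram_config(bus_width_bits, chip_gb=1):
    """Return (chip_count, total_gb) for a given bus width, assuming
    one chip per 32-bit slice and a uniform per-chip density in GB."""
    chips = bus_width_bits // 32
    return chips, chips * chip_gb

# RTX 2060: 192-bit bus with 1GB chips -> 6 chips, 6GB
print(vram_config(192))             # (6, 6)
# RX 580: 256-bit bus with 1GB chips -> 8 chips, 8GB
print(vram_config(256))             # (8, 8)
# Hypothetical double-density chips on the 2060's bus -> 12GB, same bandwidth
print(vram_config(192, chip_gb=2))  # (6, 12)
```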
 
Associate
Joined
15 Oct 2018
Posts
1,293
VRAM usage results are almost always misleading, simply because all the variables aren't isolated so people are comparing apples and mangoes. VERY FEW games impose strict vram cut offs, so you're eating apples thinking it's a mango, and marvel that it's at the same price. They're not & it's not. They really also needed to add an AMD card to compare and then actually examine the visual output. Instead what happens is that there's a lot of corners being cut & people simply don't pay attention. Just like how DLSS/Reconstruction techniques can pass for 4K for many.


This seems to be saying that Nvidia graphics cards are tailored to reducing image quality when loading up (or juggling) textures, presumably at the card's VRAM limit, using the 970 as an example, a card I'm familiar with.

Rise of the Tomb Raider at 3440x1440 was the first game that overwhelmed my GTX 970's VRAM limit, and it showed. It wasn't some subtle degradation in visual quality; it was crashing frame rates until I turned settings down, particularly textures and AA.

Considering that game released in 2015 and killed 4GB (or 3.5GB) of VRAM at slightly more than 1440p resolution, that's a tad disconcerting for the future.

It is the very basics of how memory works. GDDR chips have a 32-bit bus width, so each VRAM chip occupies a 32-bit slice of the memory interface. If your GPU needs the bandwidth afforded by a 256-bit bus, then you need 8 VRAM chips. If you only need 192-bit, then you only need 6 VRAM chips. Chips come in different densities; at the moment 1GB is fairly standard and affordable, and I'm not sure what other sizes are available, if any. Thus the 2060 ends up with 6x1GB, and the RX 580 with 8x1GB.

For Nvidia to increase the VRAM on the 2060, they could have some GDDR6 chips share a single 32-bit bus, but those chips would then have to share memory bandwidth. Nvidia have done this in the past with lower-end cards. However, just look how much fuss Nvidia got into with the 970 and its memory configuration, when some 0.5GB of its memory was slower. That huge outcry over nothing has pretty much destined the 2060 not to have 8GB of VRAM with such an arrangement.

The alternative is to create a 256-bit memory interface, which adds to the die size and production costs. This is basically what the 2070/2080 is.

In theory, if GDDR6 chips come in different densities, then the VRAM capacity can also change. This might happen with the lower-end 1650 cards with a 3GB option. If a double-density chip were available, you could have a 2060 with 12GB of memory at a huge cost, but without any more bandwidth, so the extra memory would be largely useless.

I also expect the existence of GDDR6 has actually made the case for 8GB weaker, because a 192-bit bus provides plenty of bandwidth, especially with Nvidia's efficient design. Next generation will probably see the 3060 move to a 256-bit interface and 8GB of VRAM.

Okay, sounds like you know what you're talking about (more than I do right now, that's for sure). Just wondering, and this is probably a newb question: if VRAM needs to divide neatly into the memory bus, what's the deal with the 11 GB GTX 1080 Ti?

NVM, figured it out: the 1080 Ti has a 352-bit bus width, and 352 divided by 11 = 32, the same per-chip width you quote for GDDR6.

Interesting stuff...
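The same check in code form, with the per-pin data rate thrown in (figures assumed from the 1080 Ti's public specs: 352-bit bus, 11 Gbps GDDR5X):

```python
# One chip per 32-bit slice of the bus, so 352-bit -> 11 chips -> 11GB
bus_bits = 352
chips = bus_bits // 32
print(chips)  # 11

# Peak bandwidth = bus width in bytes x per-pin data rate (Gbps)
bandwidth_gb_s = bus_bits / 8 * 11
print(bandwidth_gb_s)  # 484.0
```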
 
Soldato
Joined
26 May 2014
Posts
2,955
Chips come in different densities; at the moment 1GB is fairly standard and affordable, and I'm not sure what other sizes are available, if any.
I bet if you really put your mind to it, you could work out what size the VRAM modules are on a 4GB 470/480/570/580. :p

You're wrong about Nvidia cards not "needing" the bandwidth though. Nvidia cards can and do benefit from memory overclocking, all the way up to the high end with their 352-bit memory buses. The idea that Nvidia cut down on memory bus width because it wouldn't help performance is simply wrong - they do it because it saves money and the cards perform well enough without the increased bandwidth.

To say that they wouldn't or don't benefit from more bandwidth is simply inaccurate, though. They demonstrably do, especially at higher resolutions. You don't even need to do any overclocking to prove that, as Nvidia released Pascal cards with faster memory as separate products, and they performed better as a result of that extra bandwidth.

https://www.forbes.com/sites/antonyleather/2017/06/13/nvidias-new-gtx-1080-with-11gbps-g5x-memory-tested-how-much-faster-is-it/

Nvidia's relatively stingy memory buses over the past decade have been one of the reasons why AMD cards are usually quite a bit closer in performance at 4K than at lower resolutions (where the extra bandwidth isn't so important). Of course, it's also somewhat title-dependent, with some games benefiting from extra memory bandwidth much more than others.
 
Soldato
Joined
13 Aug 2012
Posts
4,277
Indeed, but the 2060 still beats many 8GB cards.


Also have a look here - https://youtu.be/D4lE2T4wWxM?t=124

Yeah, I just had a look; according to the vid, it seems like you can just ignore the game's limits. Edit: the vid is wrong.

I think it's nice to have more than you need, though. I suppose you never know how much the next big title is going to use.


Edit:
I just tested for myself, and going over the VRAM limit crashes the framerate in some parts of the game.

Fighting the big boss in the lab, my frame rate tanks when I'm going over my 8GB limit.
 