12GB and 16GB GPUs are NOT "A Gimmick"

More RAM on anything is better.

I can 100% prove it.

People say you don't need more than 32GB as an absolute max for gaming. Wrong. My modified Cities: Skylines save hits 38GB.

People say you don't need more than 8GB of VRAM, well on my current 8GB EVGA 1070 I've hit 7.5GB on Fallout 4 with some graphics mods at 1440p. And that game is several years old now.....

More RAM on anything is better; there is no such thing as overkill, and it's not marketing ********.

In fact, if anything, I'd accuse Nvidia of not shipping their GPUs with enough VRAM; that has historically been the case versus ATI/AMD cards, which over the years have generally had more.
 
I share your distaste for many of Nvidia's antics, but the 980 Ti had 6GB of VRAM vs the Fury X's 4GB, and the 6990 and the 295X2 both had less than ideal amounts, hastening their demise, so both parties are guilty of cheaping out in this department. Dual-chip cards especially should have been given more than twice the VRAM of the single-GPU cards, as the 4870X2 did so admirably, but corners have often been cut.

The Fury X was a totally different case though, as HBM was really only doable in a 4GB configuration at the time, and even then in tiny numbers. They probably hoped 6GB was doable but had to go with what they had, as it wasn't like they could redesign around GDDR and have it out the door in a few months. As someone else pointed out, board complexity can also contribute to less RAM being used, especially in dual-chip configs. The 295X2 had something like 1,000 memory traces laid out on a very complex PCB; adding more would only add to the complexity and cost, and price it out of most people's range.

It's not always about cutting corners in that respect; in some instances it's simply a technical limitation of the time, HBM being the prime example.
 
Some games are using quite a bit more VRAM than 8GB though, and in those games, particularly if SAM is working well, there could be a difference. If you want a card you can keep for a few years, I don't think 12GB is a bad place to be. I do think many games should still run well on 8GB cards like the 3070/3060 Ti at 1080p with settings turned down for some time to come, especially with upscaling. It might be that Nvidia didn't think VRAM would be so important when upscaling is used in AAA games.

https://youtu.be/MmSFroe7TVk?t=181
 
Feel free to post something showing otherwise then.....

The 290/X and 390/X are about as equal as you can get in terms of raw performance and, shock horror, as shown, more VRAM does not magically gain them 10+ fps; and if they do gain anything extra, they're still too weak anyway.

PS. I still have a 290 too, so I know fine well how the card performs. At 1920x1080 60Hz it's a good enough card, but not in newer games, and certainly not for 3440x1440 144Hz or even 1440p/4K 60Hz on my TV; not to mention no ray tracing, DLSS, HDR support etc.

I have had the pleasure of comparing an R9 290 4GB vs an R9 390 on my personal rigs, at 1080p they are largely the same. The 290 couldn't handle 1440p though.

As for 'the R9 390 is not a capable card': it's still playing everything me and the wife want to play at acceptable framerates; that is capable. Is it on Medium graphics? Sure... But my 1080 Ti never managed Ultra even when new, and that cost 3x more, 3 years later.....
 
Those that use DaVinci Resolve will tell you that these 3060 12GB cards are no gimmick. Or they'll probably say it is one, so you don't buy it and more stock is left for them.
 
The 3060 won't be beating a 3080 10GB anytime soon.

How do you know? If a game saturates 11GB and the 10GB on the RTX 3080 is not enough, the frame rate will begin to drop to very low numbers, because the card will begin transferring textures from the much slower main memory and the SSD.

At the same time, the 12GB on the 3060 might be just enough to avoid that and keep a framerate normal for its performance class.
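The mechanism described above can be sketched with back-of-the-envelope numbers. Everything here is an illustrative assumption (the bandwidth figures, the fraction of textures touched per frame, the uniform-access model), not a measurement:

```python
# Rough model of why spilling past VRAM tanks the frame rate.
# All figures below are illustrative assumptions, not measurements.

VRAM_BW_GBPS = 760.0   # assumed on-card GDDR6X bandwidth (GB/s)
PCIE_BW_GBPS = 32.0    # assumed PCIe 4.0 x16 bandwidth to system RAM (GB/s)

def frame_fetch_ms(working_set_gb: float, vram_gb: float,
                   touched_per_frame: float = 0.25) -> float:
    """Milliseconds spent fetching the texture data touched in one frame.

    touched_per_frame is an assumed fraction of the working set read each
    frame. Data resident in VRAM streams at VRAM speed; any overflow has
    to come over PCIe from main memory (or slower still, from an SSD).
    Access is assumed uniform across the working set.
    """
    touched = working_set_gb * touched_per_frame
    overflow = max(0.0, working_set_gb - vram_gb)
    spilled = touched * (overflow / working_set_gb)
    resident = touched - spilled
    return (resident / VRAM_BW_GBPS + spilled / PCIE_BW_GBPS) * 1000.0

print(round(frame_fetch_ms(11.0, 12.0), 2))  # fits in 12GB: all at VRAM speed
print(round(frame_fetch_ms(11.0, 10.0), 2))  # spills past 10GB: ~3x slower
```

Under these assumptions, an 11GB working set that spills 1GB past a 10GB card roughly triples the per-frame fetch time, because the spilled slice moves at PCIe speed rather than GDDR6X speed.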

More RAM on anything is better.

More memory is always better, because any hardware works better when a smaller percentage of its capacity is occupied.
If your memory is at 10% utilisation, it will be faster than if it's at 90%.
 
Far Cry 5 saw very good gains going from 16GB to 32GB, and faster RAM also helps some games, but in other games it made zero difference when I was looking into this last week.
 
More RAM on anything is better.
Of course more VRAM is better in absolute terms. But a £250 6GB 3060 would be a better product than a 12GB £300(+++++) one. Most people are looking for great price/performance, not the absolute most beastly card they can possibly buy. The 980 Ti was clearly a better buy than the Titan X for a gamer, despite having half the VRAM, because it offered much better price/performance. The 12GB on the 3060 really just ****s everybody over and is 100% a marketing decision, rather than a practical one. The card was built with 6GB VRAM in mind. The choice of bus width and performance of the core make that perfectly clear.
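The "built with 6GB in mind" point follows directly from the bus width. Each GDDR6 chip sits on a 32-bit channel, and the common densities are 1GB and 2GB per chip, so a given bus width only permits certain capacities. A minimal sketch of that arithmetic (ignoring clamshell layouts, which double the chip count):

```python
# VRAM capacity options implied by a memory bus width.
# Assumes one GDDR6 chip per 32-bit channel and the common
# 1GB / 2GB (8Gb / 16Gb) die densities; clamshell mode ignored.

def capacity_options_gb(bus_width_bits: int,
                        chip_densities_gb=(1, 2)) -> list[int]:
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    return [chips * d for d in chip_densities_gb]

print(capacity_options_gb(192))  # 3060-class bus:    [6, 12]
print(capacity_options_gb(256))  # 3060 Ti/3070 bus:  [8, 16]
print(capacity_options_gb(320))  # 3080-class bus:    [10, 20]
```

So once the 192-bit bus was fixed, the only options were 6GB or 12GB; there is no in-between configuration to retreat to.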

But Nvidia then saw that AMD were slapping a ton of VRAM on their cards, so felt they needed to compete on paper. The very notion that they originally went into designing this product stack with the idea of having an x60 card launch with more VRAM than the x80 card is so utterly ridiculous that I refuse to believe that anybody really thinks that's the case. It was a reactive move, and since it was too late to change the memory bus width, all they could do is go to 12GB. And, given how ridiculously bunched up the 3060, 3060 Ti and 3070 are, the price was clearly raised to cover that.

So now, entirely because of marketing **** waving, we have another overpriced card that nobody can buy, which barely offers better price/performance than a product launched two years ago. And the £200-250 range which used to be the sweet spot for tons of people continues to go completely neglected in the process.
 
... given how ridiculously bunched up the 3060, 3060 Ti and 3070 are, the price was clearly raised to cover that.

So now, entirely because of marketing **** waving, we have another overpriced card that nobody can buy, which barely offers better price/performance than a product launched two years ago. And the £200-250 range which used to be the sweet spot for tons of people continues to go completely neglected in the process.

This.

Some of us have mentioned/discussed this over the past three years, particularly pointing out that the Turing release confirmed the pricing was now broken: there was effectively no 'mid-range' anymore, and chiefly Nvidia were just releasing tens of card flavours instead of keeping a simple three tiers. To counter anything AMD did in that period, they pushed out the Supers.

We were almost going in the right direction with the 3080 being showcased as a £649 card, which should have meant the 70 and 60 being staggered lower than that. It's now a ******** of high prices, with people parting with way more than MSRP; the £300-500 band is an absolute mess.
 
I think it's important to know that it's not for these tech influencers to decide what we need or don't; can they go f themselves please.

Exactly this. Nvidia and AMD know what combination of memory interface and core count a card needs to meet a price point and target audience; far better than people who press “Run Benchmark” ever will.

In a perfect world they probably would've given the 3060 8GB, but then it'd require a wider interface.
 
Exactly this. Nvidia and AMD know what combination of memory interface and core count a card needs to meet a price point and target audience; far better than people who press “Run Benchmark” ever will.

In a perfect world they probably would've given the 3060 8GB, but then it'd require a wider interface.

Not necessarily a wider interface. It could have been narrower with faster memory clocks.
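The narrower-bus point is just arithmetic: peak bandwidth is bus width times per-pin data rate, so a narrower bus with faster chips can land in the same place. The 192-bit/15Gbps figures match the retail 3060; the 128-bit/21Gbps alternative is purely hypothetical:

```python
# Peak memory bandwidth = (bus width in bytes) x per-pin data rate.
# 192-bit @ 15 Gbps matches the retail 3060; the 128-bit @ 21 Gbps
# configuration is a hypothetical narrower-but-faster alternative.

def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbps(192, 15))  # 360.0 GB/s -- retail 3060
print(bandwidth_gbps(128, 21))  # 336.0 GB/s -- hypothetical narrow/fast build
```

A hypothetical 128-bit card with 21Gbps chips would get within ~7% of the 3060's bandwidth, but a 128-bit bus would also cap the capacity options at 4GB or 8GB with common chip densities.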

It's just a poor design choice by incompetent Nvidia engineers. The entire 3000 series is a joke: 24GB -> 10GB -> 8GB -> 12GB and whatnot.
 
Also, it's not the fully enabled chip: the full die has 3840 shaders and a 256-bit interface, which for the retail market is cut down to 3584 shaders and a 192-bit memory interface.
 
@Th0nt you should make a '24GB is not a gimmick' thread mate. It has 6GB extra VRAM and the proper fast GDDR6X variant. Helps smooth out frames, and in 6 years' time all that extra VRAM will come in handy; I mean, just look at the R9 390 :p

In the meantime I would have had at least 3 upgrades in that time, recycling the same money and adding £100-£200 on top each time (still less money overall than a 3090). Even the third card may not have 24GB of VRAM, but it will have 2-3 times the raster and RT performance :p:D
 