12GB and 16GB GPUs are NOT "A Gimmick"

There's no point in 12GB on a card that can barely manage 1440p at 60fps minimum in most games, so I agree with them. 8GB would have been fine and would have lowered the cost slightly.

The 3070 and above should have had more memory, though.

8GB would require a 256-bit bus; the 3060 has a 192-bit bus to keep costs down. GDDR6 is only available in 1GB and 2GB chips, and each chip has a 32-bit interface. Six chips multiplied by 32 bits = 192 bits.
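To put that arithmetic in one place, here's a minimal sketch in Python (purely illustrative; the 1GB/2GB densities and 32 bits per chip are the figures from this post):

```python
# A minimal sketch of the constraint described above, assuming 32-bit GDDR6
# chips available only in 1GB and 2GB densities (figures taken from the post).

BUS_PER_CHIP_BITS = 32   # each GDDR6 chip takes a 32-bit slice of the bus
DENSITIES_GB = (1, 2)    # available chip capacities in GB

def possible_capacities(bus_width_bits):
    """VRAM totals reachable on a given bus width with uniform chips."""
    chips = bus_width_bits // BUS_PER_CHIP_BITS   # 192 // 32 = 6 chips
    return [chips * density for density in DENSITIES_GB]

print(possible_capacities(192))   # 3060's 192-bit bus -> [6, 12] GB
print(possible_capacities(256))   # a 256-bit bus -> [8, 16] GB
# 9GB would need nine 1GB chips, i.e. a 9 * 32 = 288-bit bus.
```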
 
The features that make it worth buying are things like NVIDIA Broadcast, DLSS, RTX Voice, etc.

It is definitely not for people already on 5700 XT / 2060 cards, as you have pointed out; you don't really get any boost in performance, just a dent in your wallet.
 
The 1660 Ti / Super still manage fine with 6GB; it's not ideal, but it would have made more sense than 12GB.

Aren't they releasing a 6GB version as well though? Just like they released two versions of the 1060. Then there's the 3050 Ti etc. There are a lot of price gaps to fill between the entry-level sub-£100 GPUs and the 12GB 3060.
 
There's no point in 12GB on a card that can barely manage 1440p at 60fps minimum in most games, so I agree with them. 8GB would have been fine and would have lowered the cost slightly.

The 3070 and above should have had more memory, though.

Clearly you forgot to read the OP, which described why it couldn't be 8GB :rolleyes:
 
Another part of this question is: does the increase in the cost of the card due to the extra RAM get nullified by the general increase in the cost of all graphics cards?

The (base) 3060 is completely pointless as a graphics card imo (irrespective of price or amount of RAM); they should have just brought out the Ti version and been done with it.


The OP mentioned "multiples of 3GB or 6GB". I presume they couldn't have done 9GB? I know with system RAM there are performance benefits to installing two sticks; I wasn't sure if this was the case with GDDR on graphics cards as well.

So the 3060 has 12GB. For that card it is maybe overkill, but I'm sorry, these people who think themselves so clever are so stupid they don't realise why Nvidia did it.
With a 192-bit bus it can only have VRAM in multiples of 3GB or 6GB, and people were not happy with 6GB of VRAM on a midrange card; that's why Nvidia upgraded the 2060, which had a 192-bit bus and 6GB, to the 2060 Super with a 256-bit bus and 8GB. Nvidia wanted the 3060 to be a more efficient design, so it got a smaller 192-bit bus. That meant 6GB or 12GB, and since 6GB was pretty much out of the question, it had to be 12GB.


Although admittedly it does seem strange to be commenting on such a relatively small price differential when the rest of the kit is probably going to require high-end components to go with it (high-res monitor, decent CPU and possibly RAM, etc.).

The other thing I would suggest is that the OP mentions a cost of $20 for the additional RAM (I'm presuming that's the cost to NV), but by the time the card actually gets to the street (even when prices are more normal) that will probably have risen to around $60, and right now probably over $100.


I personally think there are too many performance tiers, which makes a lot of "price brackets" redundant. Why not have three current-gen products and rely on three previous-gen products to sit between them, and then when the third gen comes out, the first gen goes EOL? I really can't see the point of having so many different tiers (and this doesn't include the 3090/6900 XT, which should go EOL as soon as the next gen comes out, as these barely make sense as current-gen products, let alone previous-gen).
 
The OP mentioned "multiples of 3GB or 6GB". I presume they couldn't have done 9GB?

GDDR6 is only available in 1GB and 2GB chips, and each chip has a 32-bit bus, so nine 1GB chips would require a 288-bit bus. Only six 1GB or six 2GB chips add up to the 192-bit bus that the 3060 has.

There used to be a work-around, like they did with the 660 Ti, where two extra memory chips on the back of the PCB shared their bandwidth with two existing ones. The first 1.5GB of VRAM had a 192-bit bus, but the extra 0.5GB had a 96-bit bus, so it ran at half speed.
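To put some numbers on that split, here's a rough Python sketch (illustration only; the 192-bit/96-bit figures come from the paragraph above, and the ~6 Gbps effective data rate is an assumed stock 660 Ti GDDR5 speed):

```python
# A rough illustration of the split above, using the figures from this post
# (first 1.5GB on a 192-bit bus, the last 0.5GB on 96 bits) and assuming a
# stock 660 Ti effective GDDR5 data rate of ~6 Gbps per pin.

DATA_RATE_GBPS = 6.0   # effective Gbit/s per pin (assumed stock 660 Ti rate)

def peak_bandwidth_gb_s(bus_width_bits, rate_gbps=DATA_RATE_GBPS):
    """Peak bandwidth in GB/s: pins * per-pin rate, divided by 8 bits/byte."""
    return bus_width_bits * rate_gbps / 8

print(peak_bandwidth_gb_s(192))  # ~144 GB/s for the fast 1.5GB region
print(peak_bandwidth_gb_s(96))   # ~72 GB/s for the slow 0.5GB tail (half speed)
```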

There was also another work-around, like they did with the 550 Ti, where they used two different sizes of memory chip to achieve a non-standard amount of VRAM, a technique called asymmetrical memory.

It's been a long time since VRAM has been bodged in this way, so 6GB or 12GB are the only options with a 192-bit bus.
 
Yeah, I do remember the 4-core days, and even some on this forum were spouting the '2 cores for gaming' line not so long ago...

I remember everyone saying "you don't need an i7 for gaming, an i5 is plenty!", so I bought an i5 4690. Well, it turned out the i7 4790 had much longer legs, and I had to swap it out for one of those in time. I won't make that mistake again.

You get the sense they are trying to walk a tightrope of kowtowing to Nvidia while still trying to be independent. JayZ2Cents looks like he's had enough, like he's lost interest in GPU reviews; he gets sent these Nvidia cards and runs through the script like someone has a gun to his head.

At the end of the day that's all that matters, isn't it? The "influencers" are trotting out their speeches / toeing the party line, so everything's fine from Nvidia's POV...
 
I remember everyone saying "you don't need an i7 for gaming, an i5 is plenty!", so I bought an i5 4690. Well, it turned out the i7 4790 had much longer legs, and I had to swap it out for one of those in time. I won't make that mistake again.

Agreed mate, always stretch to what you can afford so it doesn't bite you later. I always get CrossFire/SLI motherboards, for example, just in case; I know it's pretty dead now, but I hate tiny mATX boards anyway.
 
I remember everyone saying "you don't need an i7 for gaming, an i5 is plenty!", so I bought an i5 4690. Well, it turned out the i7 4790 had much longer legs, and I had to swap it out for one of those in time. I won't make that mistake again.

But an i5 was all you needed at the time, so they were right. It's up to the individual whether they want to future-proof.

Right now, people are saying the 5900X is overkill for gaming, and I agree. However, it's blatantly obvious that it will outlast the 6- and 8-core chips. When people give advice, it's based on the here and now.
 
But an i5 was all you needed at the time, so they were right. It's up to the individual whether they want to future-proof.

Right now, people are saying the 5900X is overkill for gaming, and I agree. However, it's blatantly obvious that it will outlast the 6- and 8-core chips. When people give advice, it's based on the here and now.

Here's the problem with this "future proofing" mindset though: it's somewhat pointless, because fast-forward to when games really need the extra CPU power and the newer budget entry-level CPUs will still be far better than older top-end CPUs.

i.e. in my case, my Ryzen [email protected] isn't really cutting it for games like Cyberpunk, but neither is a 2700X etc., and a 5600X will be far better than the top-end 2xxx Ryzen CPU. So essentially you can save yourself £100/200+ now and put that towards a much better entry-level CPU when more power is needed; a 5900X might last longer than a 5600X, but the entry-level Ryzen 7600/8600 (which will probably move to 8 cores as the default instead of 6 anyway) is going to be far better than a 5900X for just gaming in 2-3+ years' time.

Of course, this largely comes down to whether the motherboard and RAM would also need upgrading to work with newer-gen CPUs.
 
Here's the problem with this "future proofing" mindset though: it's somewhat pointless, because fast-forward to when games really need the extra CPU power and the newer budget entry-level CPUs will still be far better than older top-end CPUs.

i.e. in my case, my Ryzen [email protected] isn't really cutting it for games like Cyberpunk, but neither is a 2700X etc., and a 5600X will be far better than the top-end 2xxx Ryzen CPU. So essentially you can save yourself £100/200+ now and put that towards a much better entry-level CPU when more power is needed; a 5900X might last longer than a 5600X, but the entry-level Ryzen 7600/8600 (which will probably move to 8 cores as the default instead of 6 anyway) is going to be far better than a 5900X for just gaming in 2-3+ years' time.

Of course, this largely comes down to whether the motherboard and RAM would also need upgrading to work with newer-gen CPUs.
From Tom's Hardware, a 2700X is 20% faster than a 2600X with RT on at 1080p in CP2077. While both are under 60fps, I think a person with a 2600X will NEED to upgrade before a person with a 2700X will have to.

The person with a 2700X may be able to last one more generation. Or, to put it another way, a person with a 2600X is probably looking for an upgrade right now, while someone with a 2700X could be satisfied until Zen 4 comes out.
 
From Tom's Hardware, a 2700X is 20% faster than a 2600X with RT on at 1080p in CP2077. While both are under 60fps, I think a person with a 2600X will NEED to upgrade before a person with a 2700X will have to.

The person with a 2700X may be able to last one more generation. Or, to put it another way, a person with a 2600X is probably looking for an upgrade right now, while someone with a 2700X could be satisfied until Zen 4 comes out.

Have a look at videos showing the 1% and 0.1% lows; the Ryzen 2700X is pretty poor, whereas a 5600X trashes it in Cyberpunk. Cyberpunk is the first game to really show up my [email protected].

Plus most PC gamers on here will be playing at 1440p and/or 144Hz too.

You're right though: people with the higher-end CPUs could maybe hold off a bit longer than someone on an entry/budget CPU like the 2600(X), but ultimately I imagine most would want to upgrade sooner rather than later, and that extra £100/200+ saved could let them buy into the new gen sooner.
 
Cyberpunk is a poor game to base this on, as it was likely rushed and is unoptimised. Maybe one day it will get a few patches or some love. It brings lots of good systems to a crawl, so hardware from 2+ years ago is not exactly going to shine in this title.
 
Have a look at videos showing the 1% and 0.1% lows; the Ryzen 2700X is pretty poor, whereas a 5600X trashes it in Cyberpunk. Cyberpunk is the first game to really show up my [email protected].

Plus most PC gamers on here will be playing at 1440p and/or 144Hz too.

You're right though: people with the higher-end CPUs could maybe hold off a bit longer than someone on an entry/budget CPU like the 2600(X), but ultimately I imagine most would want to upgrade sooner rather than later, and that extra £100/200+ saved could let them buy into the new gen sooner.
I wasn't arguing whether or not the 2700X stands up to the 5600X; I'm purely talking about longevity and the need to upgrade.

With regard to using the money saved to buy the next gen: it depends on a load of variables that you cannot possibly foresee on the day of purchase, for example a person's financial position when the new tech is released, or whether anything else in the PC must change to support the new tech.

The other way of looking at the cost calculation is cost per year owned. If the extra £100/£200 can double the usable lifespan of a product, is it worth it?
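As a toy illustration of that cost-per-year framing (Python; the prices and lifespans are made-up placeholders, not real figures):

```python
# A toy version of the cost-per-year-owned comparison above. Prices and
# lifespans here are made-up placeholders, not real figures.

def cost_per_year(price_gbp, years_owned):
    return price_gbp / years_owned

cheaper = cost_per_year(300, 3)   # e.g. a 6-core part replaced sooner
pricier = cost_per_year(500, 6)   # e.g. a 12-core part kept twice as long
print(f"cheaper part: £{cheaper:.0f}/yr, pricier part: £{pricier:.0f}/yr")
# If the extra outlay really does double the usable lifespan, the dearer part
# wins per year, but that lifespan is exactly the part you're guessing at.
```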

At the end of the day, regardless of which option you take (future-proofing vs. saving money and getting what works now), it is nothing more than a guessing game.
 
At the end of the day, regardless of which option you take (future-proofing vs. saving money and getting what works now), it is nothing more than a guessing game.

Yes, the "professional" reviewers who make a living out of reviewing hardware don't always get it right. I remember the Ryzen 3 3300X getting good reviews when it came out in May 2020. Now, less than a year later, it's being found to cause stutter in some games like Cyberpunk compared to the 6-core, 12-thread Ryzen CPUs released before it, e.g. the 2600. I don't think 4-core/8-thread CPUs are going to cut it for some AAA games in the future.
 