AMD or Nvidia for BO6?

Speaking of the 3060 Ti, I was looking at those. While I don't know what good prices are, they come up at the lower end of the budget I'm setting and have obviously been rated highly over the 4000 series for a while; however, I am a little dubious about whether they are too costly for how old they are. I'd like to think the 3060 will run BO6 really well.
I don't really follow used prices so couldn't comment there, but for new the 3060 is fine at ~£250. The only snag is that the 4060 has also come down and they're literally competing in the same space. Daniel Owen did a long video comparing the two and his conclusion was that, despite the 12GB of VRAM on the 3060, he'd probably take the 8GB 4060 instead. I understand his reasoning there and I'm coming around to it too.

The 3060 Ti is a decent card, roughly comparable to the 6700 XT, but to simplify things in my own mind I have a cut-off (with new cards) where I just say "8GB = nope", usually because someone is looking at 1440p and I just don't think 8GB makes sense any more for anything except a low-cost, entry-level gaming card for 1080p.
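
Purely to illustrate that cut-off, here's a minimal sketch (the card list and VRAM figures below are just examples typed in for illustration, not a recommendation):

```python
# Toy encoding of the "8GB = nope" cut-off described above.
# Card names and VRAM figures are illustrative examples, not a recommendation.

def clears_vram_cutoff(vram_gb: int, resolution: str) -> bool:
    """8GB is treated as fine only for entry-level 1080p; 1440p wants more."""
    if resolution == "1440p":
        return vram_gb > 8
    return vram_gb >= 8

cards = {"RTX 3060 12GB": 12, "RTX 3060 Ti 8GB": 8, "RX 6700 XT 12GB": 12}
for name, vram in cards.items():
    verdict = "ok" if clears_vram_cutoff(vram, "1440p") else "nope"
    print(f"{name}: {verdict} for 1440p")
```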
 
The price I saw for a 3060 Ti was new; I haven't looked at used, but the 2060 is only now at what I'd consider a decent used price, so the 3000 series will be questionable. I don't have to buy new as such, I just don't find some used prices good value in comparison.

I will have to look for that comparison video, but it is interesting that the 3000 series is apparently meant to be superior to the 4000 series, at least below a 4080, yet someone would still choose the 4000 version. So many factors and mixed reviews.

The 1070 Ti I had, which I kept at stock settings, could do a regular 70+ fps with the odd dip to the 50s in BO6, with settings and render resolution as high as I could get them within the VRAM limit, and that was at 1440p. Despite the dips the game was smooth and had no issues. That to me is very good, though I don't know what a good fps is for 1440p at high settings? The extra VRAM above 8GB would mainly be to increase a few quality settings for that game and Cyberpunk 2077; otherwise I'm not particularly fussed about VRAM as long as it's not below 8GB.
 
I'd say that (this is from a POV of performance, not comparing used prices to new):

For the same price: the 3060 12GB is a better card for creators, but the 4060 8GB is a better card for gamers. I can't remember his full reasoning, but the 4060 does have higher performance and lower power consumption. You can skip to the final thoughts using the timestamps in his video.

The 4060 used to be more expensive, like £300+, with the 3060 12GB available near £250, so it was a different choice then, though I've seen the 3060 12GB come down toward £200 in the last few days.

The 3060 Ti and 4060 Ti are pretty close in performance, but the 4060 Ti is consistently faster in a PCI-E 4.0 board and more energy efficient. Nvidia cut the lanes to 8 (which doesn't help in PCI-E 3.0 boards) and cut down the memory bus, which means the 4060 Ti loses performance against the 3060 Ti as the resolution goes up.
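
To put rough numbers on those cuts, a quick sketch (theoretical maxima only, and the VRAM specs are quoted from the published spec sheets, so treat it as approximate):

```python
# Back-of-the-envelope numbers for the lane and memory-bus cuts mentioned above.
# Approximate theoretical maxima; VRAM specs are assumed from published spec sheets.

def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Approximate one-direction PCI-E bandwidth in GB/s."""
    per_lane = {3: 0.985, 4: 1.969}  # GB/s per lane after encoding overhead
    return per_lane[gen] * lanes

def vram_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Memory bandwidth = bus width (bits) x data rate (Gb/s per pin) / 8."""
    return bus_bits * gbps_per_pin / 8

print("x16 Gen3:", round(pcie_bandwidth_gbs(3, 16), 1), "GB/s")  # 3060 Ti in a Gen3 board
print("x8  Gen3:", round(pcie_bandwidth_gbs(3, 8), 1), "GB/s")   # 4060 Ti in a Gen3 board
print("x8  Gen4:", round(pcie_bandwidth_gbs(4, 8), 1), "GB/s")   # 4060 Ti in a Gen4 board

print("3060 Ti VRAM:", vram_bandwidth_gbs(256, 14.0), "GB/s")  # 256-bit GDDR6 @ 14 Gbps = 448
print("4060 Ti VRAM:", vram_bandwidth_gbs(128, 18.0), "GB/s")  # 128-bit GDDR6 @ 18 Gbps = 288
```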

The 4070 I'd consider just better than the 3070: it has significantly more performance, more VRAM and modest power consumption. The same goes for the 4070 Super, which, in the absence of deals for the 4070, is looking the better buy now that you can get one around £500.

It is a personal thing ofc, but I'd want a new card intended for 1440p to be averaging 100+ FPS in TPU's test suite at the highest settings. If you look at the graphs above the results, they show that some games dip considerably below the average, so my thinking is that an average of 100 fps gives you some headroom to lower the settings a little and stay above 60 fps for the next 3-5 years.

If the card is already below an average of 100 fps, I think that suggests it is going to age quickly at that resolution.

For reference, the 7700 XT hit 103.5 FPS and the 6800 (in the same review, here) 100.7 FPS, and I consider those cards more or less the entry point for 1440p now. The 4060 hit 69.6 FPS, which suggests it can handle 1440p in the majority of games just fine right now, but in the longer term I think it will fall below decent playability in newer games (especially AAA) sooner rather than later. For a casual gamer playing older games then sure, it would do the job.
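
As a very rough illustration of that headroom argument (the yearly demand-growth rate is purely an assumption for the sketch, not measured data):

```python
# Rough illustration of the "100 fps average = headroom" reasoning above.
# The 15%/year demand-growth rate is an assumption for the sketch, not measured data.

def projected_fps(avg_fps_today: float, years: int, yearly_growth: float = 0.15) -> float:
    """Estimate average fps in future titles if game demands grow at a fixed yearly rate."""
    return avg_fps_today / (1 + yearly_growth) ** years

for fps in (100.7, 69.6):  # 6800 and 4060 averages quoted from the TPU review above
    print(f"{fps:5.1f} fps today -> ~{projected_fps(fps, 4):.0f} fps in 4 years")
# ~100 fps today leaves roughly 58 fps after 4 years; ~70 fps today drops to roughly 40.
```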
 
Interesting. As for the PCI-E interface you speak of, the motherboard is a Gigabyte A520M, I believe the K V2 (not at home to check), which I don't plan on changing unless it fails, it having been a used motherboard when I got it. So I guess that's something to work with, if there's much difference between that and the better non-A boards?

If I had to put my usage into a percentage, then I guess around 50% will be gaming, likely less. I want a two-monitor setup, so I will keep my 4K 60Hz Samsung for everything, but get a 1440p monitor, possibly 170-200Hz, maybe even just 144Hz, for the gaming side. 100fps would be a nice target for the more demanding games.

The trouble I sometimes find is that a lot of the higher-end cards are triple-fan, or the size of a triple-fan card. My case supports long cards, but it is also a small case, so I'm looking at two-fan variants only.
 

AMD runs COD better, and generally under DX12 AMD cards do better on slower CPUs; the AMD cards have lower driver overhead too. As an RTX 3060 Ti owner myself, I would get an AMD card for COD.
 