Is the 3060ti overkill for 1080p?

rn2
Associate | Joined 13 Mar 2017 | Posts 523 | Location England
I was going to buy a 3070 to pair with a 5900X, but I've been advised that it can cause games to run badly, with the card being too powerful for the resolution and the CPU having to work harder. I still don't fully understand the technicals around it, but I'm starting to sway towards the 3060 Ti. Would it still be overkill? I guess maxing out the settings would fix the issue more easily than on a 3070. I just don't want to buy the wrong card. P.S. I'm very happy with my 1080p monitor. It's G-Sync and very good quality.
 
Soldato | Joined 18 Oct 2002 | Posts 4,515
Going off the fact that the 2070 Super is considered overkill for 1080p, and the 3060 Ti is slightly more powerful, then yeah, you could probably say it is overkill.

However, I don't think you can go wrong with the 3060 Ti right now if you can find one at close to MRRP. I'd say it is certainly the card to get from the green team right now for anyone who isn't bothered about 4K or 8K and maxed-out settings. I can barely tell the difference between a game on Ultra and Low settings these days anyhow, though that's probably to do with having a 40-year-old set of eyes.
 

ljt
Soldato | Joined 28 Dec 2002 | Posts 4,540 | Location West Midlands, UK
There is no such thing as overkill. Either one of those cards would be great for high-refresh 1080p. Your CPU is top end too, so no problems there. You'll also be able to pretty much max out ray tracing effects in games that support it, which is a bonus!
 

rn2 (OP)
Associate | Joined 13 Mar 2017 | Posts 523 | Location England
I think I've figured it out. Someone said that the CPU will bottleneck the GPU, but they assumed that I'd be playing at hundreds of frames per second. I'm quite happy to limit my frames to 144fps, so am I right in saying that the CPU won't be bottlenecking the GPU in this instance? I think so, anyway.

They also assumed that I'd have an old CPU.
 
Associate | Joined 20 Jul 2015 | Posts 386
They assumed a lot, and so got a lot wrong.

The 5900X is one of the best gaming CPUs around, and probably overkill itself. A 3070 would be a good match for it, and it would make it easy to move up from 1080p :)
 
Soldato | Joined 24 Jan 2006 | Posts 2,524
My 3060 Ti gets ~60-70fps in CP2077 at RT Ultra, 1080p.

If that's an example of a 'current gen' game, it seems like the 3060 Ti is entry level for RT @ 1080p.
 

rn2 (OP)
Associate | Joined 13 Mar 2017 | Posts 523 | Location England
> My 3060 Ti gets ~60-70fps in CP2077 at RT Ultra, 1080p.
>
> If that's an example of a 'current gen' game, it seems like the 3060 Ti is entry level for RT @ 1080p.

I'm not interested in RT, so I should get even more frames. Maybe I'll stick with a 3060 Ti. For you to get that many frames, they must have fixed some issues with the game's performance recently?
 
Soldato | Joined 24 Jan 2006 | Posts 2,524
> I'm not interested in RT, so I should get even more frames. Maybe I'll stick with a 3060 Ti. For you to get that many frames, they must have fixed some issues with the game's performance recently?

AFAIK the default RT Ultra preset uses DLSS, so it's not that impressive at 1080p with DLSS doing some of the work.

The 3060 Ti is a decent enough card; the 3070 is only ~10% faster for 25% more cash.
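
Put another way (a trivial sketch using only those two percentages, nothing else assumed):

Code:
perf_ratio = 1.10   # 3070 is roughly 10% faster
cost_ratio = 1.25   # 3070 costs roughly 25% more
print(perf_ratio / cost_ratio)   # ~0.88 -> about 12% less performance per pound spent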
 
Associate | Joined 29 Aug 2013 | Posts 1,175
16 TFLOPS -> 20 TFLOPS is a 25% increase in raw compute.
The 3080 has over double the TFLOPS of the 2080 Ti, yet it is around 30% faster, not 100%. Ampere seems to have inflated TFLOPS figures that don't scale with gaming performance; something to do with the CUDA core changes, I think.
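
Rough back-of-the-envelope numbers for that comparison (the TFLOPS figures below are approximate spec-sheet values, so treat them as illustrative):

Code:
# Approximate FP32 TFLOPS from the spec sheets (assumed, rounded figures)
tflops = {"3060 Ti": 16.2, "3070": 20.3, "2080 Ti": 13.4, "3080": 29.8}

def pct_increase(old, new):
    """Percentage increase going from old to new."""
    return (new - old) / old * 100

print(f"3060 Ti -> 3070: +{pct_increase(tflops['3060 Ti'], tflops['3070']):.0f}% TFLOPS")   # ~+25%
print(f"2080 Ti -> 3080: +{pct_increase(tflops['2080 Ti'], tflops['3080']):.0f}% TFLOPS")   # ~+122%
# Measured gaming uplift from the 2080 Ti to the 3080 is only around +30%,
# nowhere near the TFLOPS jump, i.e. Ampere TFLOPS don't translate 1:1 into fps.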
 
Soldato | Joined 22 Oct 2008 | Posts 11,483 | Location Lisburn, Northern Ireland
> I was going to buy a 3070 to pair with a 5900X, but I've been advised that it can cause games to run badly, with the card being too powerful for the resolution and the CPU having to work harder. I still don't fully understand the technicals around it, but I'm starting to sway towards the 3060 Ti. Would it still be overkill? I guess maxing out the settings would fix the issue more easily than on a 3070. I just don't want to buy the wrong card. P.S. I'm very happy with my 1080p monitor. It's G-Sync and very good quality.

If you can afford a 5900X and a 3070, I'd say you could afford a new 1440p 144Hz screen :D
 
Associate | Joined 29 Aug 2012 | Posts 84
Hi OP

I've just upgraded from a Palit Super JetStream GTX 980 to a Palit Dual RTX 3060 Ti.

I actually game @ 1080p on a G-Sync monitor and the performance increase between the two cards is very noticeable. Every game I play now gives a butter-smooth frame rate.

Hope that answers your question

:)
 

rn2 (OP)
Associate | Joined 13 Mar 2017 | Posts 523 | Location England
> Hi OP
>
> I've just upgraded from a Palit Super JetStream GTX 980 to a Palit Dual RTX 3060 Ti.
>
> I actually game @ 1080p on a G-Sync monitor and the performance increase between the two cards is very noticeable. Every game I play now gives a butter-smooth frame rate.
>
> Hope that answers your question
>
> :)

Thank you :)
 
Associate | Joined 30 Nov 2020 | Posts 6 | Location Nottingham
> I was going to buy a 3070 to pair with a 5900X, but I've been advised that it can cause games to run badly, with the card being too powerful for the resolution and the CPU having to work harder. I still don't fully understand the technicals around it, but I'm starting to sway towards the 3060 Ti. Would it still be overkill? I guess maxing out the settings would fix the issue more easily than on a 3070. I just don't want to buy the wrong card. P.S. I'm very happy with my 1080p monitor. It's G-Sync and very good quality.


Faster hardware will never make your gaming experience worse than slower hardware.

I see a lot of people who don't understand how it all works, so I'll explain! Sorry if any of this feels patronising to read; it's not my intention.

Your monitor's resolution describes how many pixels it has, and how they're laid out.
Your monitor's refresh rate describes how many times the pixels change each second. A 60Hz monitor will update each pixel 60 times per second, to display 60 different images each second.
Most people see a big improvement moving from 60Hz to 144Hz; the improvement gets much smaller as the refresh rate increases further.
The number of frames you will see in a game is limited by your monitor's refresh rate. It doesn't make a meaningful difference whether you get 60fps or 1000fps on a 60Hz display, you're limited to seeing 60 new frames every second.
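
If it helps, here's the same idea as a one-liner (a simplified sketch that ignores tearing and variable refresh):

Code:
def frames_seen_per_second(rendered_fps, refresh_hz):
    """You can never see more new frames per second than the panel can display."""
    return min(rendered_fps, refresh_hz)

print(frames_seen_per_second(1000, 60))    # 60  -> the extra rendered frames are never shown
print(frames_seen_per_second(100, 144))    # 100 -> below the refresh rate, the GPU is the limit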

The Graphics card's job is to figure out what the game should look like, and display that on your monitor.
It does this by colouring in the pixels.
Each graphics card has a limit to the number of pixels it can fill in a given period of time. Stronger graphics cards fill more pixels.

If your monitor has a higher resolution, each image has more pixels. This means that in a set period of time (like 1 second), your graphics card may draw the same number of pixels, but those pixels will be spread across fewer frames. If you double the pixel count per frame, you halve the number of frames per second.

A 1080p (1920x1080) frame has 1/4 the pixels of a 4K (3840x2160) frame, so a graphics card could draw 4x the frames per second. This is why you may have heard people saying that a 3060 Ti is "overkill" for 1080p: it's more than capable of running games at 1080p at 144Hz+ refresh rates. The issue with this claim is that different games are, well, different. On a 3060 Ti, most games will run 1080p @ 144Hz/144fps on ultra settings and still have plenty of headroom, but not all games will run the same. A sleek competitive game like CSGO might draw 600-700fps @ 1080p with the same GPU, while the most demanding games today (Cyberpunk, for example) don't manage to hit 144fps on high/ultra settings at 1080p, so in that case the 3060 Ti definitely isn't "overkill".
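
You can sanity-check that 4x figure yourself (a deliberately simplified sketch that assumes frame rate scales purely with pixel count, which real games never quite do):

Code:
pixels_1080p = 1920 * 1080    # 2,073,600 pixels per frame
pixels_4k    = 3840 * 2160    # 8,294,400 pixels per frame

print(pixels_4k / pixels_1080p)        # 4.0 -> a 4K frame is 4x the pixels of a 1080p frame

# With a fixed pixel-fill budget, frame rate scales roughly inversely with resolution:
budget = pixels_4k * 60                # a card that can just manage 4K at 60fps
print(budget / pixels_1080p)           # 240.0 -> the same budget covers ~240fps at 1080p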


Now, what does the CPU do?

The CPU calculates everything necessary for the game's world. It calculates positions and trajectories of moving objects, all that sort of thing.

It would be a waste to do this constantly, so the CPU waits until the GPU is ready to draw another frame.
At this point the CPU does all of its calculations, passes the necessary data to the GPU, and the GPU can figure out how to display it all on your monitor.
The CPU does work for each frame that is drawn. More frames = more work for the CPU.

At 1080p, you've got fewer pixels in each frame, so your frames per second will be higher, which means more work for the CPU.
At 4K, your fps will be much lower, so the CPU has much less work to do.
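
One way to picture that (a simplified sketch that treats the CPU's work as a fixed cost per frame):

Code:
def cpu_frame_budget_ms(fps):
    """Milliseconds the CPU has to prepare each frame at a given frame rate."""
    return 1000 / fps

for fps in (60, 144, 240):
    print(f"{fps:>3} fps -> {cpu_frame_budget_ms(fps):.1f} ms per frame for the CPU")
# 60 fps  -> 16.7 ms
# 144 fps ->  6.9 ms
# 240 fps ->  4.2 ms
# The higher frame rates you tend to get at 1080p leave the CPU less time per frame,
# which is why a fast CPU matters more at lower resolutions.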

Stronger CPUs provide the potential for higher FPS. They effectively raise the FPS-ceiling.
At lower resolutions like 1080p, you do generally want a stronger CPU to ensure you benefit from higher frame rates, but with your 5900X, that's about the best you can get right now.

The 3060 Ti is certainly a very strong 1080p card, and it may be overkill for quite a few games, particularly competitive games, which tend to run well on low-end hardware, but that doesn't mean it's a bad GPU for 1080p.
It certainly won't make anything worse, it'll just keep trying its hardest to draw as many frames as possible.

You absolutely don't need a 3060 Ti for 1080p, but it's not going to make things worse, and it's not so "overkill" that it's a total waste of money.
If you're more of a Fortnite/CSGO/Rocket League player, it's maybe a bit expensive for what you need.
If you're a big fan of pretty open world games, or AAA titles in general, it's a good choice.
 