What Is The Best GPU That My PC Can Run At Its 100% Capacity?

Associate | Joined: 17 Aug 2023 | Posts: 14 | Location: Istanbul, Türkiye
Hi All

I have a pretty old PC and I want to replace its Nvidia GTX 970 with a better GPU. My PC consists of:

- Asus Z170-P motherboard
- Intel i5-6600K CPU (OC @ 4.1 GHz)
- 16 GB Kingston DDR4-2666 RAM
- 650 W Glacial Power PSU

I want to know: what is the best GPU that my PC can run at its 100% capacity? I don't want to make a wasted investment by purchasing a GPU that my PC can't run at 100%.

Another question... As you know, some PC games stress the CPU and GPU in a balanced way, but others lean more on the GPU and less on the CPU. So, considering this second type of game, does it make sense to put a GPU in a PC that can't always drive it at 100% capacity?

Thanks
 
A 6700 XT would be the highest I'd go (~£300) and that's still a great upgrade on a 970.


As for the second question: you can balance it out somewhat by playing at higher resolutions/settings, but it's just a fact that some games will run more poorly than they should. Would that stop me buying a decent graphics card? Nope; buy whatever is good value, regardless of CPU.
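One way to sanity-check this on your own system: log GPU utilisation while you play, and if it sits well below ~95% while your fps is low, the CPU is the limiting factor. A minimal Python sketch, assuming a single NVIDIA GPU with nvidia-smi on the PATH (AMD users would need a different tool):

```python
# Minimal sketch: sample GPU utilisation while a game runs. If it sits
# well below ~95% while your fps is low, the CPU is the bottleneck.
# Assumes a single NVIDIA GPU with nvidia-smi on the PATH.
import subprocess
import time

samples = []
for _ in range(60):  # sample once a second for a minute of gameplay
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    samples.append(int(out))
    time.sleep(1)

avg = sum(samples) / len(samples)
print(f"Average GPU utilisation: {avg:.0f}%")
print("Likely CPU-bound" if avg < 90 else "GPU-bound (normal)")
```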
 
The problem is that a lot more games now have issues running on pure quad cores with no HT, so it depends on what the OP wants to run. Also, the PSU the OP is using may date from around 2008.

The GTX 970 consumes around 160W on average and nearly 200W at peak.

The RX 6700 XT consumes more power, and so does the RX 6650 XT; the RX 6600 XT slightly less. I would probably look at the RX 6600 non-XT if the PSU is that old.
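To put rough numbers on the headroom question, here's a quick back-of-the-envelope sketch in Python. The GPU peak figures are roughly the ones quoted above; the CPU draw, system overhead, and the 70% derating for an aged unit are illustrative assumptions, not measurements of the OP's PSU:

```python
# Rough PSU headroom estimate. All figures are illustrative assumptions,
# not measurements of the OP's specific unit.

PSU_RATED_W = 650        # label rating of the old 650W unit
PSU_AGE_DERATE = 0.70    # assume an aged PSU safely delivers ~70% of label
EFFECTIVE_W = PSU_RATED_W * PSU_AGE_DERATE

CPU_OC_W = 120           # overclocked i5-6600K, rough peak (assumption)
REST_OF_SYSTEM_W = 60    # board, RAM, drives, fans (assumption)

for gpu, peak_w in [
    ("GTX 970 (current)", 200),
    ("RX 6600",           160),
    ("RX 6700 XT",        260),
]:
    total_peak = CPU_OC_W + REST_OF_SYSTEM_W + peak_w
    headroom = EFFECTIVE_W - total_peak
    print(f"{gpu:18} peak system ~{total_peak}W, headroom ~{headroom:.0f}W")
```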
 
@Cat the Fifth
It is really hard to find a new socket 1151 CPU nowadays in my country, and I haven't been able to find an i7-7700K so far (the i7-7700K is the best CPU my motherboard supports). Also, a second-hand i7-7700K would probably be heavily used and out of warranty. Yes, my PSU is pretty old (from 2009), but it has never let me down, even though I have used different GPUs with it.

@Raumarik
The link is very helpful, thank you...

@Quartz
I generally play RPG-style games like Far Cry, Assassin's Creed, GTA, the Elder Scrolls series, etc. For FPS, I sometimes play Call of Duty games. My choices will probably be similar in the future. I play at 1920x1080, and this is my monitor...
https://www.asus.com/displays-desktops/monitors/gaming/vg248qe/
 
Power supplies have capacitors which age as they get older, especially in warmer climates. The electrolyte slowly deteriorates, which decreases the power they can output safely (especially when handling transient spikes, which are more common with newer cards). Yours also appears to be one of the older split-rail designs, so that further limits how much power it can supply to a graphics card.

An RX 6700 XT, RTX 3060 or RX 6650 XT will all draw more power than a GTX 970, so I would avoid them. The RX 6600 XT and RX 7600 look similar in power consumption to your GTX 970. However, as it's an old PSU, I would go for a lower-power-consumption card just in case.

So the two I would look at are:

1.) RX 6600 non-XT

[power consumption chart]

2.) RTX 4060 8GB

[power consumption chart]


However, even though the RX 6600 is slower, a few things favour it:

1.) In the UK it is around £100 cheaper than an RTX 4060.
2.) It comes with a free copy of Starfield until 30th September if you buy from one of the listed retailers.


The retailers listed in Türkiye are:
Game Garaj (https://www.gamegaraj.com/amd-ryzen-radeon-starfield-bundle/)
Gaming.gen.tr (https://www.gaming.gen.tr/amd-ryzen-radeon-starfield-bundle)
Hepsiburada (https://www.hepsiburada.com/staticpage/1105641967355023)
Incehesap (https://www.incehesap.com/icerik/secili-urunlerde-starfield-oyunu-hediye)
Itopya.com (https://www.itopya.com/amd-strafield-game-bundle_s199)
Sinerji Bilgisayar Electronics (https://www.sinerji.gen.tr/amd-starfield-game-bundle-cp-470)
Vatan Computer (https://www.vatanbilgisayar.com/secili-amd-urunlerinde-starfield-oyununa-sahipol/)

3.) COD works better on AMD cards: RX 6600 > RTX 3060, and RX 6600 XT = RTX 4060/RTX 3060 Ti.

4.) AMD cards work better with older CPUs in modern DX12 and Vulkan games.

In newer games, the Nvidia card will be held back by your CPU (the opposite of some older DX11 games).

Having said that, I am looking at UK pricing.

What are the prices of the following cards in your country?
1.) RX 6600
2.) RX 6600 XT
3.) RX 6650 XT
4.) RX 7600
5.) RX 6700 XT

In the UK, that is the pricing order from cheapest to most expensive.

Also, how much are the following Nvidia cards?
1.) RTX 3060
2.) RTX 4060

Without knowing prices in your country, it is also hard to know what to recommend in terms of value for money.
 

Zotac RTX 3060 Ti = £195
Sapphire RX 6700 XT 12 GB GDDR6 = £188
Sapphire RX 6700 Pulse 10 GB = £159
XFX RX 6600 XT = £145
XFX Speedster SWFT 210 Radeon RX 6600 = £130

All used for less than a year and still under warranty.

 


An RX 6600 would probably be OK if you want to keep your PSU. If you can upgrade to a decent-brand PSU, then the RX 6700 XT will have the greatest longevity in newer games. I have the RTX 3060 Ti; it does have better ray-tracing performance and it is better in older games, but 8GB of VRAM will hold it back in newer ones.
 
THE graph G.O.A.T. ;) xoxox
 
Right now that's true, but Hardware Unboxed, for one, seem to think that's about to change.

I have run everything you can think of at ultra settings, 4K native, with RT on (or 4K high on things like TLOU, which will still do low-to-mid 60s with the latest patch) on my 4070, and my max VRAM usage is 9.3-9.7GB... in all the latest titles... I only play at 1440p, so I can't see 12GB being an issue for the next few years, especially not with DLSS 3.0.

I've also messed around with UE5 stuff, and again the reality seems different from what the overhyped, clearly sponsored journos/YouTube shills claim.

When you do a side-by-side with someone on a higher-VRAM card, the game simply allocates more VRAM because it's there; yet their utilisation seems to be higher than mine, even at the same res/settings?

So either it's developer favouritism towards Nvidia in terms of optimisation/utilisation/allocation, or AMD cards just want to use a bit more in their method of rasterisation? No bias here, as I'm actually an AMD/ATi guy by preference: I have an AMD CPU in both builds and an AMD card in my 2nd rig...
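If you want to check the allocated-versus-used point on your own card rather than trust anyone's screenshots, here's a minimal Python sketch that polls the NVIDIA driver while a game runs (a single NVIDIA GPU with nvidia-smi on the PATH assumed; AMD owners would need another tool). Bear in mind the driver reports memory it has allocated, which is exactly why a card with more VRAM often shows a bigger number at the same settings:

```python
# Minimal sketch: poll the NVIDIA driver for VRAM in use vs. total while
# a game runs. Single NVIDIA GPU with nvidia-smi on PATH assumed; note
# that "used" here is memory the driver has allocated, not a measure of
# what the game strictly needs.
import subprocess
import time

def vram_mib():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used, total = (int(x) for x in out.split(","))
    return used, total

while True:
    used, total = vram_mib()
    print(f"VRAM: {used} MiB used of {total} MiB ({100 * used / total:.0f}%)")
    time.sleep(5)  # sample every 5 seconds
```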

As with cars, I often have multiple brands/models at once, as they're good in their own individual ways. In this case I just fancied trying something different: the 4070's 200W TBP (and 53-58°C without the fans even spinning) versus the 350-450W of a 6800-6950 XT (and the 80-110°C of a 6950 XT burning my leg) was a no-brainer at the same price...
Plus DLSS 3.0 gives me a very nice bit of future-proofing headroom for when it can no longer hold a capped 60fps in SP at high/ultra... Having RT on without crippling the system was also a nice bonus.

What sold it for me was seeing I could run 1440p-4K native at ultra settings at around 130W when undervolted, without the fan even coming on, at 53-58°C (the fan only comes on at 65°C stock on my Asus 4070 anyway). So when I'm playing SP games at 1440p native, ultra settings, capped to 60fps (I don't see the point of more than 60fps outside MP), the 4070 sips power at around 105-120W!

My system is silent and costs me £18-22 a month in electricity, including amp/speakers/monitor, playing every evening for 6 hours :) It literally pulls 260-280W all-in at the wall, versus 350-450W for just the GPU of a high-tier 6xxx/7xxx card at the same settings/res...
Those 6xxx/7900 XT(X) GPUs, paired with the rest of the components/monitor/speakers/amp, easily end up at 500-600W and would cost me double a month in electricity! F that!

Over a year that's a lot of money wasted when you're somewhere with disgusting electricity pricing and want to enjoy unlimited play time on the new system you've just built!

People may say it's very expensive for what it is, but I've already got half my money back in a year's electricity saved versus spending it on the rival card, which can't do RT/DLSS 3.0/frame generation.
So by the 2nd year it's cost me half that in what I've saved in electricity, and by the third year it's free... and so on. When DLSS can't help it any more, it'll just go in my 2nd rig as an upgrade, no biggie!
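To make the electricity maths easy to reproduce, here's a quick sketch; the at-the-wall wattages are the rough figures above, while the 34p/kWh unit price and 6 hours/day are assumptions (roughly the 2023 UK price cap), so plug in your own numbers:

```python
# Back-of-the-envelope running-cost comparison. Wattages are the rough
# at-the-wall figures quoted above; unit price and hours are assumptions.

PRICE_PER_KWH = 0.34   # GBP; roughly the 2023 UK price cap (assumption)
HOURS_PER_DAY = 6      # evening gaming, as above
DAYS_PER_MONTH = 30

def monthly_cost(watts):
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
    return kwh * PRICE_PER_KWH

low_w, high_w = 270, 550   # whole setup: 4070 build vs. high-tier RX build
saving = monthly_cost(high_w) - monthly_cost(low_w)
print(f"Lower-power build:  £{monthly_cost(low_w):.2f}/month")
print(f"Higher-power build: £{monthly_cost(high_w):.2f}/month")
print(f"Saving: £{saving:.2f}/month, £{saving * 12:.2f}/year")
```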

And I'll stress this point again: I am an AMD/ATI guy, with AMD CPUs in both rigs and an AMD GPU in the 2nd, but I can't argue with that feature set/performance/power consumption. I'd have bought this regardless of who made it (just like with cars). IMHO it's as impressive as the RX 6600 XT was in its time, punching way above its weight class for its power consumption! There literally ISN'T another card that can sip 105-135W at ultra 1440p-4K native and produce the fps this can, let alone with frame generation/RT/DLSS on, while still not going above that wattage!

A LOT of people don't realise what their system is costing them each month, let alone each year, yet complain about overpriced hardware. Kind of ironic, no?

Oh, and it goes without saying that if I hadn't been happy I'd have sent it straight back, what with everything having a no-questions-asked 14-day return/refund policy!
 
Except nobody cared 6 months ago, when most AMD cards consumed less power than the Ampere cards, as my posts above showed: even an RTX 3060 consumes much more power than an RX 6600 or RX 6600 XT. The RTX 3050 consumes more power than an RX 6600 and is slower.

AMD has also had Radeon Chill since 2016.

But how many people will probably still buy an RTX 3050 over an RX 6600, even now? You could make that excuse in 2021, when AMD cards were hard to get, but from last year onwards?

Europe has had rising energy prices since late 2021.

Moreover, all the power saving in the world won't mean anything if Nvidia makes cards like the RTX 4070 Ti 12GB. Of course it will draw less power than an RX 7900 XT: it has 60% of the VRAM.

Also, most people keep their cards for between 3 and 5 years. So if you run out of VRAM, not only will you "save power" by having rubbish performance, you will also have to upgrade the card earlier. On a slower card with enough VRAM you can always ramp up textures: a GTX 960 4GB could run higher-quality textures than a GTX 1060 3GB.

This is what HUB is pointing out, and they are correct.

In my case, I got an RTX 3060 Ti 8GB because AMD cards were hard to get at RRP in the UK in 2021. But now I am running out of VRAM and will have to upgrade sooner rather than later. You can always undervolt/mod cards to draw less power - I do it with my RTX 3060 Ti in a 12-litre mini-ITX case.
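For anyone wanting to try this, the crudest version on an Nvidia card is capping the board power limit through the driver; a proper undervolt via a voltage/frequency curve editor (e.g. MSI Afterburner) keeps more performance. A minimal sketch (needs root/admin; persistence mode is Linux-only; the 150W cap is an arbitrary example):

```python
# Minimal sketch: cap an NVIDIA card's board power limit via nvidia-smi.
# Needs root/admin. The 150 W figure is an arbitrary example; a curve
# undervolt (e.g. MSI Afterburner) retains more performance than a cap.
import subprocess

subprocess.run(["nvidia-smi", "-pm", "1"], check=True)    # persistence mode (Linux-only)
subprocess.run(["nvidia-smi", "-pl", "150"], check=True)  # power limit, in watts
subprocess.run(["nvidia-smi",                             # confirm the new cap
                "--query-gpu=power.limit", "--format=csv"], check=True)
```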

Trash like the RTX 4060 Ti 8GB might draw less power than an RX 6700 XT, but it costs more. So even if the RX 7700 XT is not much faster than an RX 6700 XT and draws more power than an RTX 4060 Ti 8GB, who in their right mind would get such a card?

But you can't mod more VRAM in. How many people used power consumption as an excuse not to buy an R9 390 8GB over a GTX 970 4GB? If they were mini-ITX users I might understand, but people with massively overclocked CPUs in their systems? Which card has lasted longer?

How much does a new card cost? Hundreds of pounds. And what's the resale value of VRAM-limited cards? Not great, because people will sell them en masse once it's obvious they are EOL.

And this is coming from someone who actually does care about heat output, because of my tiny PC - I've built nothing but SFF systems since 2005!

An RX 6700 XT is the obvious choice over an RTX 4060, especially as they are priced so similarly in the UK.
 
We've always got on well, you and me, and we both listen to and respect each other's experiences... so I'll add mine :)

If you remember, I cared 18 months ago, and people on here laughed at me, saying 8GB would last forever at 1440p-4K :cry:

Yeah, I wouldn't have paid for a Ti... I had the money for a 7900 XT and wanted that originally, but seeing its power consumption versus a 4070 at the same settings, even at 1080p, it was a no-brainer.

As I say, from real-world testing (rather than relying on clearly sponsored online reviewers), I've noticed I use around 1-2GB less VRAM than any rival or higher-tier AMD card. So either games are being coded to majorly favour Nvidia, or AMD cards just operate differently and are more VRAM-hungry while achieving the same or higher fps. No idea, but I've literally seen it in person with an RX 6800 XT versus mine, for example, and it makes no sense when both are at the same res/settings, yet mine will use 9.3-9.7GB at the most demanding OTT settings at 1440p or 4K WITH RT ON... and they're using 11-12GB?
So that means I'll be using 11-12GB when they'll be using 14-15GB in the future.

As we both know, a 3080 is a solid card for its time (excluding the price), and this does what that does, but better in places, whilst using less power with lower temps and quieter fans... So for me, despite the model-tier naming, it was a no-brainer. I have no interest in the whole "this should have been called an X card, not a Y" argument, as both teams are guilty of that; just look at the 7900 XT, which is a 7800 XT with a dress on!

As I said in my post, mate, when it gets to the point that DLSS can't save it at a capped 60fps, mid-high settings, 1440p for SP games, it'll go in my 2nd rig as an upgrade, and by that point it'll have cost me nothing...
Because after the 2nd year, the money I save in electricity versus having a 6800/6950/7900 XT paired with the rest of my system + speakers/amp/monitor will have paid it back!

Remember, my total draw at the wall with monitor/speakers/amp/PC is only 260-280W at 1440p or 4K native ultra, and at 1440p DLSS it's even less... A 6800/6900/6950/7900 XT on its own uses 350W+ at 4K, and that's excluding the rest of the components + monitor/speakers/amp! F that!

Even underclocked they CANNOT knock 200W+ off and get anywhere near my stock TDP at 1440p/4K high/ultra native. I've seen friends' builds recently in person, and no, it's not possible; anyone who isn't biased will happily admit that.

Yeah, I totally get why you got that card in your situation; I considered one at the time too, but chose to wait until this year to see what was what.

Yeah, IMHO I'd have a 6700 XT over a 4060 Ti 16GB :cry:

Haha, it's funny you should mention mITX: my SFF 2nd system is mITX, so that played a part in getting extra life out of the 4070. I went for an Asus Dual, which is very small for such a powerful card and will happily fit in my 2nd rig when the time comes :) That'll give it further years in its chosen role as a TV media-playback/emulation rig.

So I can't lose, really. Especially when it's costing me peanuts a month to run, with my entire setup (including speakers/amp/monitor) using less than even an undervolted RX 6950 XT/7900 XT at 4K, which can't do RT/frame gen/DLSS 3.0!

As I mentioned before, I'm willing to give things a chance (as with different car manufacturers) rather than being narrow-minded about one brand. I'll say this: I've had to eat my words over DLSS 3.0, as I wasn't convinced before, but on an up-to-date, patched, modern or vanilla game it's VERY hard to tell. I literally messed about with my 4K TV comparing native 4K and 4K DLSS, along with 1440p native on my new monitor, and yeah, you'd really HAVE to know where to look to notice, if you notice at all. So many older games are now adding support for it too (not that I care, as they don't struggle running native, but it's nice they're bothering regardless).

So, TL;DR: after 3 years, if it can't play nice with the help of DLSS 3.0/frame gen etc. at mid-high 1440p, it'll just go in the 2nd SFF mITX TV rig as a bonus upgrade. By that third year I'll have saved £264+ per year that I would otherwise have burned running an RX 6950/7900 XT paired with the rest of the system/speakers/amp/monitor's power draw. So I doubt I'll be that bothered come year 4, as that's over £500 that would have been wasted on electricity, effectively giving me a free GPU; whereas by year 4 on an RX AMD card I'd have no money saved and the same performance or worse once you account for the lack of RT/DLSS... win-win.

I'd have bought it whoever made it, mate, as with a car. It majorly reminds me of how wicked the RX 6600 XT was in its time for its power consumption/undervolting versus performance.

Oh, and I've got a 3-year warranty (which the reference AMD 6950 XT that runs stupidly hot and loud, with coil whine, doesn't have, nor does the Sapphire 7900 XT I would have chosen), and I don't have to faff about sending it abroad myself. And it's silent: the fan never kicks in, it only gets to 53-57°C, so my rig's fans barely kick in and my room isn't a sweatbox, all whilst sat next to an airing cupboard...
 