
What Is The Best GPU That My PC Can Run At Its 100% Capacity?

Zotac RTX 3060 Ti = £195

Sapphire RX 6700 XT 12 GB GDDR6 = £188

Sapphire RX 6700 Pulse 10 GB = £159

XFX RX 6600 XT = £145

XFX Speedster SWFT 210 Radeon RX 6600 = £130

All used for less than a year and still under warranty.

The 6700 Pulse 10 GB looks like the best deal by far. Great price/perf. Definitely save up for a CPU upgrade though, for the games you play it's going to matter a lot. I'm sure there's gotta be some second-hand deals for an AM4 mobo + like an R5 5600 for ~$100.
 
I cared 18 months ago and people on here laughed at me, if you remember, saying 8GB would last forever at 1440p-4K :cry:

Yeah I wouldn't have paid for a Ti... I had the money for a 7900xt and wanted that originally, but seeing its power consumption even at 1080p vs a 4070 at the same settings, it was a no-brainer.

As I say, from real-world testing rather than relying on clearly sponsored shills in online reviews, I've noticed I use around 1-2GB less than any rival or higher-tier AMD card. So either games are being coded to heavily favour Nvidia, or AMD cards just operate differently, achieving the same or higher fps while being more VRAM hungry. No idea, but I've literally seen it in person with an rx6800xt vs mine for example, and it makes no sense when both are at the same res/settings yet mine will use 9.3-9.7GB on the most demanding OTT settings at 1440p or 4K...
So that means in the future I'll be using 12GB when they'll be using 14-15GB.

As I said in my post mate, when it gets to the point DLSS can't save it at 1440p at a capped 60fps on mid-high settings, it'll go in my 2nd rig as an upgrade, and by that point it'll have cost me nothing... because of what I save after the 2nd year vs having a 6800/6950/7900xt paired with the rest of my system + speakers/amp/monitor... Remember my total draw at the wall with monitor/speakers/amp/PC is only 260-280W at 1440p or 4K native ultra, and at 1440p DLSS it's even less... A 6800/6900/6950/7900xt alone uses 350W+ at 4K, and that's excluding the rest of their components + monitor/speakers/amp! F that!

Even underclocked they CANNOT knock off 200W+ and get even close to my stock TDP at 1440p/4K high/ultra native. I've seen friends' builds recently in person and yeah, not possible; anyone who's not biased will happily admit that.

Yeah IMHO I'd have a 6700XT over a 4060ti 16gb :cry:

I have no interest in the whole 'this should have been called an X card, not a Y' debate, as both teams are guilty of that. Just look at the 7900xt: that's a 7800xt with a dress on!

Haha, it's funny you should mention mITX; my SFF 2nd system is mITX, so that played a part in getting extra life out of the 4070. I went for an Asus Dual, which is very small for such a powerful card and will happily fit in my 2nd rig when the time comes :) Then that'll give it life for further years in its chosen usage as a TV media playback/emulation rig.

So I can't lose really. Especially when it's costing me peanuts a month to run, with my entire setup including speakers/amp/monitor using less than even an undervolted RX6950XT/7900XT at 4K that can't do RT/frame gen/DLSS 3.0!

As I mentioned before, I'm willing to give stuff a chance, as with different car manufacturers, rather than being narrow-minded about one brand. I will say this: I've had to eat my words with DLSS 3.0 as I wasn't convinced before, but it's VERY hard to tell on an up-to-date patched modern or vanilla game. I literally messed about with my 4K TV comparing native 4K and 4K DLSS, along with playing 1440p native on my new monitor, etc., and yeah, you'd really HAVE to know where to look to notice, if you notice at all. So many older games are now adding support for it (not that I care, as they wouldn't struggle running native, but it's nice they're bothering regardless).

So TLDR: after 3 years, if it can't play nice with the aid of DLSS 3.0/frame gen etc. at mid-high 1440p, it'll just go in the 2nd SFF mITX TV rig as a bonus upgrade. By that third year I'll have saved £264+ on what I would have been burning extra per year running an RX 6950/7900xt paired with the rest of the system/speakers/amp/monitor's power draw, so I doubt I'll be that bothered come year 4, as that's over £500 that would otherwise have been wasted on electricity, effectively giving me a free GPU. Whereas by year 4 on an RX AMD card I'd have no money left to spend and the same performance or worse, if you account for the lack of RT/DLSS... Win win.
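If anyone wants to sanity-check that sort of electric maths against their own numbers, here's a rough Python sketch of the sums I'm on about. The wattage gap, hours per day and unit rate below are just example placeholders, not my actual figures, so swap in your own:

```python
# Back-of-envelope: extra running cost of a higher-draw setup vs a lower-draw one.
# All three inputs are example assumptions - use your own at-the-wall readings.
extra_watts = 150        # assumed extra draw at the wall (W)
hours_per_day = 5        # assumed gaming hours per day
pence_per_kwh = 30       # assumed electricity unit rate (p/kWh)

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_pounds_per_year = extra_kwh_per_year * pence_per_kwh / 100

print(f"Extra energy per year: {extra_kwh_per_year:.0f} kWh")
print(f"Extra cost per year:   £{extra_pounds_per_year:.2f}")
print(f"Extra cost over 3 yrs: £{extra_pounds_per_year * 3:.2f}")
```

Obviously your hours and unit rate swing those numbers massively, which is the whole point of running it on your own setup rather than trusting anyone else's figures.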

I'd have bought it whoever made it mate, as with a car. It majorly reminds me of how wicked the rx6600xt was for its power consumption/undervolting vs performance.

I am not considering the RTX4070 because it is actually the RTX4060TI 12GB, i.e. it's a similar uplift (around 40%) to what the RTX2060 Super to RTX3060TI was, and the GTX1070 to the RTX2060 Super was. The RX7800XT is really the RX7700XT, as it will give a similar uplift over the RX6700XT (around 40% if the estimated performance leaks are correct). So basically Nvidia/AMD are adding £100~£200 extra to the price. So they need to be under £500 at most.

When you think of it this way, this is why some of the cards look good on power consumption: because they are really mainstream parts.

So AMD will spin that the RX7800XT looks better than the RX6800XT because it looks more efficient. But they will want to sell it for over £500 I suspect, 5 months after the RTX4070.

You can't just compare the RTX4000 parts only to RDNA2 - a lot of people online seem to quietly ignore the loads of RTX3000 parts around. In fact, stock of RTX3000 parts is much higher in many markets worldwide. Plenty of people are also buying them new and secondhand, because AMD simply supplies fewer markets.

Nvidia marketing has pushed all this power consumption talk mostly against the AMD parts, but far less is said about markets still saturated with lots of RTX3050/RTX3060/RTX3060TI/RTX3070 cards. Many people could have saved money on power consumption just by buying mainstream AMD parts instead of Nvidia ones up to April this year. But they would still buy an RTX3070TI over an RX6800, or an RTX3050 over an RX6600.

The RTX3070TI was apparently a 16GB part originally.

That is the problem - Nvidia saves on VRAM because it saves on cost and POWER consumption. What I found very suspicious about a lot of these new games going over 8GB is that they only seemed to do so when Nvidia started releasing the RTX4000 series. Then they launch a 16GB RTX4060TI too.

It's almost like they want to make the RTX4060/RTX4060TI so poor that they upsell you to an RTX4060TI 16GB/RTX4070 12GB.

I was annoyed with AMD and its useless launches, but the RTX4060TI 8GB is an Apple-level cynical move - it's a GT1030 DDR4 move IMHO. They get away with it because AMD can't get their own house in order either.

No dGPU released over £300 should have under 12GB of VRAM. Anything over £600 should have more than 12GB IMHO.
 
FYI I just edited my original post to make it a bit more legible, as I was typing whilst eating originally :cry: :D So if it was a bit spangled, that's why haha! I'm still eating, so be warned with this one :cry:

Yeah, in your situation that makes perfect sense. For me, I have a 3500X/RX580 8GB/16GB 2666MHz/B450i as my 2nd rig, which I only got super cheap off a friend whilst halfway through building this one/awaiting stock for the final parts. I hadn't had a new PC in 15 years other than Macs used as workhorses/browsing/media consumption until halfway through building this all-new rig with the 4070, so for my usage this was a no-brainer. The cool thing was I got the all-AMD SFF to play with whilst I waited for stock and mulled over which parts to finalise it with vs stock/restocking/models being retired for new ones.

I personally always cared about power usage. Even when I was younger and would leave my heavily overclocked rigs on, I'd be using £12 a week on the bloody meter card :cry:
Which quickly led me to building an overclocked XP-M SFF HTPC for DLs etc...

If you remember, I had already swapped out all my PS consoles for slim variants and switched to using my gaming monitor over my TV, so this was just something I was into, and it saved me money that could be spent on cars/life in general, as I love getting my bang for buck/tweaking stuff to suit my needs.

I see what you're saying mate, but I've seen people with 4090s manage pretty impressive undervolts, like 100-120W less, and they've got 24GB of VRAM?
I haven't seen that with binned, roided-up cards like the 6950xt, which literally uses up to 140W more than a 6900xt... Which is quite frankly ridiculous considering a 7900xt only uses what, 12-20W more than a 6800xt...

I, for example, build silly cars with engine swaps etc., and if I could make them run the same power on half the 99-octane fuel... I'd be stupid not to, wouldn't I? So despite wanting a 6950xt/7900xt, buying a 4070/5700x over a 5800x3d/6950xt/7900xt was a no-brainer, as both run silent/cool, undervolt like beasts and cost me sod all to run, and only minimal amounts if I use them excessively daily (which I like to :D). As you know I had the budget for way more than what I bought, it just didn't make sense in my situation.

I think it's hilarious when people say 'oh, I don't care what it costs to run a month', then you ask how much they're paying per kWh, add up their rig + speakers/amp/monitor, and they're soon shocked at how much wattage/money that costs over the year! Then it makes sense to them...
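If anyone reading wants to do that exercise for their own gear, it's literally this much maths (the component wattages and unit rate below are made-up examples; a plug-in meter at the wall is the honest way to get real figures):

```python
# Toy annual running-cost tally for a whole setup.
# Every number is an example placeholder - measure your own kit at the wall.
setup_watts = {
    "pc": 220,       # assumed average draw while gaming (W)
    "monitor": 35,
    "amp": 25,
    "speakers": 10,
}
hours_per_day = 4    # assumed hours of use per day
pence_per_kwh = 30   # assumed unit rate (p/kWh)

total_watts = sum(setup_watts.values())
kwh_per_year = total_watts / 1000 * hours_per_day * 365
print(f"Total draw: {total_watts} W")
print(f"Annual cost: £{kwh_per_year * pence_per_kwh / 100:.2f}")
```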

But yeah, TLDR: I'll never agree with Nvidia's pricing nor the whole sneaky naming game both AMD and Nvidia play, but if AMD/Intel/Dave down the road had made this I'd still have bought it based on my requirements vs performance. As I say, with cars it's just the same: I love my JDM and I love my old DTM stuff, and I'll never be narrow-minded and let others tell me I can't like both. I think that's bordering on brain-damage mentality personally!

FWIW I came £600 under budget, got what I wanted in terms of my own strict requirements (i.e. quiet/silent and uses FA electric yet packs a punch at my chosen res/settings), managed to buy a 2nd rig which helped out a mate in need, AND got myself a brand new 32" 1440p 165Hz monitor within budget that I wouldn't have been able to afford before, AND have nearly £200 left...
I even managed to change my mouse to something posher/big-hand friendly/wireless and replace all my fans with silent ones!

Yet if I'd listened to the drivel online I'd have had a 5800x3d or AM5 that I didn't need, plus a loud-as-f-fan 6950xt rinsing my plug socket whilst burning my leg, with less warranty and zero gains for my usage scenario/requirements! Now I have a bonus monitor I didn't think I could afford/justify running (due to the AMD cards eating way more electric at that res than my original stay-at-1080p plan), plus a 2nd rig/quiet fans/new mouse/nearly £200 left over.

And the cherry on top is that the CPU/RAM/GPU will live on in the SFF in the future as a free upgrade for its usage, which again is winning. As I say, by year 3 this is literally giving me free money from what I save in electric vs having an AMD card at these settings/res!

Can't say fairer than that can you! Win WIN!
 
You have to consider I have had everything from Shuttle systems to numerous compact mini-ITX systems, etc. For most of the last 18 years I had sub-500W PSUs. I only have an SF600 now because I found one cheap. I tend to undervolt my CPUs and dGPUs, and run custom curves. I have saved over 50W just doing these tweaks on my CPU and GPU.

Also I cap FPS in games such as Fallout 4, so power consumption is well under 200W. WRT the Ryzen 7 5700X, I have one, and a Ryzen 7 5800X3D would be a significant upgrade even at QHD. In games such as Fallout 4 the X3D CPUs really are impressive.

My mate who has owned numerous RX6600 and RX6700 series cards has massively dropped power. Same with a mate who had an RX6800 (or was it an RX6800XT). People have dropped the power of the RX6600 series under 100W. AMD had things like Radeon Chill, which was a dynamic FPS cap utility that could drop power a lot. But consistently you find that when AMD has an advantage it is always underplayed. I remember back to the Fermi days. You could get HD6850/HD5850 cards, both secondhand and new, for the price of a GTX460TI. But Nvidia sold loads of those cards, and they still ended up selling more despite worse power draw. Even during the GCN1.0 days, all the internet focused on the HD7970 series, which did consume more power than a GTX680 (but had far more VRAM), while most seemed to ignore that the other cards were more efficient.

Then when Nvidia made the GTX750TI, all the media went on about how great it was, but ignored the overclocked ones, many of which were not bus-powered. They also forgot the HD7850 (the earlier version of the R7 265) was often faster, could be had for less money, and came with bundles of some decent games (so you could sell them). The R7 265 could be had for similar money too, and that was a pre-overclocked HD7850 2GB.


But, but, the 30W of power was more important to people, for a card which consumes less power than an RX6600/RTX4060 anyway. Which might be a consideration if you have a Dell, but not when people were building systems with good enough power supplies. For whatever reason, when AMD are ahead in efficiency they never seem to get much traction, but if Nvidia is, everyone talks about it. OTOH, looking at how they marketed RDNA3, I am not at all surprised.
 
Haha I loveeee a Shuttle! They were so cool back in the day! Super compact lil cubes!

Yeah, the 6600/6700 undervolt nicely, it's just the 6800/6900/6950/7900 that don't do it enough for my needs. My rx580 8gb runs at 96W if I knock the memory down -200 and power limit it with a 60fps frame cap for SP games, whilst being able to play basically anything other than CP/TLOU at 60fps. So I was VERY impressed with that for a 2nd-hand PC TV usage build!

Yeah, see, AMD always seems to have all the info broadly shared on its successful tweaking. That's why I always wanted a 6600xt, but that time has passed vs gaming requirements now; I was well keen on one of them due to the widely shared information on what it could achieve undervolted.

Yet it seems to be harder to find that stuff for Nvidia, as owners just jump on the high horse of 'well, I can afford to run it at X power so I don't care', whilst literally wasting money they could be saving on electric/temps/heat if undervolted, and have a god complex about the cost of it and it being THE best you can get. That isn't my thing; it has to be the best for what I want, not for the community/benchmarks...

I did love Adrenalin too, a bit bloated as a package but the footprint doesn't seem to be, and it has everything you need, but I also don't mind the simplicity of Afterburner either :)

Yeah, I was gutted I missed out on many cool AMD cards over the last 15 years tbh. My last rig was a 1st-gen Q66 stepping quad core, 2.4GHz at 3.8GHz stable on air at summit crazy like 58-64C, thanks to a mad Antec 100 case which had a 220 or 240mm top fan and multiple 120mms all with fan controllers, with whatever the best AMD dual-GPU-on-one-card was at the time - remember when that was a thing haha! It had an amazing HSF on it and a knob at the back of the PCI-e slot to turn the fan down manually haha - remember those, like my Socket A turbine-style Coolermaster cooler on my 3200XP!

I totally know what you mean/agree though man. I've dipped in and out of the scene over the years whilst messing around with women/cars, but got way nosier when the Nvidia 1xxx series era cards came out; again though, cars came first so I never bothered.

I tell you what does seem to punch very well above its weight... this EVGA GTX 660 2GB a forum friend gave me for the £5 postage, to use with my new rig till my GPU/PSU/RAM/motherboard was in stock. It plays a hell of a lot newer games than you'd think at max settings, 60fps+ 1080p...
Genuinely well impressed with that, as I only wanted it to have a play on CS:GO to kill time till my GPU etc. came back in stock, as I was borrowing bits from my 2nd rig to make one usable system, and then ended up making two fully working systems in the end.

:cry: Probably the best fiver I've ever spent on a PC part haha! But I'd have got the AMD alternative back in the day when it was new though, ain't gonna pretend I wouldn't haha!


Oh, and it goes without saying that if I hadn't been happy with any of the build parts I'd have sent them straight back, what with everything having the no-questions-asked 14-day return/refund policy! There's no way I'd 'put up with it' when it's built entirely for gaming and nothing else!
 
Nvidia marketing has pushed all this power consumption talk mostly against the AMD parts, but far less is said about markets still saturated with lots of RTX3050/RTX3060/RTX3060TI/RTX3070 cards. Many people could have saved money on power consumption just by buying mainstream AMD parts instead of Nvidia ones up to April this year. But they would still buy an RTX3070TI over an RX6800, or an RTX3050 over an RX6600.
...
That is the problem - Nvidia saves on VRAM because it saves on cost and POWER consumption. What I found very suspicious about a lot of these new games going over 8GB is that they only seemed to do so when Nvidia started releasing the RTX4000 series. Then they launch a 16GB RTX4060TI too.

This. Now add in the price saving from the latter statement, then include the extra Ada price hike, and it makes an even wider delta, for extra lols.
 
I like that my new card was a rip-off price and offends people, as I'm a secret mole/shill for AMD as we all know, yet I'm sleeping with the enemy and liking the shex! :cry:

I need to learn to act more of a helmet though and start belittling my own people more, then I'll get my proper cult membership where we all add up what we've wasted on scenetax and cry into our own wrists in a group zoom meeting :D
 
Oh btw @CAT-THE-FIFTH @gpuerrilla my RX580 8GB crappy Powercolor decided it wanted to kill itself yesterday...

Probably to spite me for buying that 4070/or just cause it heard me saying how much of a nightmare I've had with that brand and decided to self harm, either or...

Good thing is it'll make a great video when me and my mate smash it to bits/throw it out of a window... :D :p

Due to this rudeness I've put the old but still faithful EVGA GTX 660 2GB I had into that 2nd SFF rig as its replacement, to annoy the rx580!
It was only meant to be a temp card to play CS:GO on whilst I was waiting for parts of the new build to come back in stock, but it seems to do anything I need it to do, and Linux Mint likes it, unlike the rx580... Go figure.
 

My mate's Powercolor PCS R9 390 managed to survive being shoehorned into a mini-ITX case which had an SFX power supply. It was replaced by a B-grade Sapphire RX5700XT because OcUK were selling them off cheap last year. I got an RX5600XT for £99.99 from them last year too.

But it appears Nvidia screwed over its best AIB partner in EVGA, like they did in the past with BFG.
 
Nioce!

Mine stunk of legit electrical-fire smells when I opened the case after shutting it off; it was in a Silverstone SG13 with an SFF Fractal PSU. The old faithful GTX is doing the job nicely now though, I'm speaking to you from that rig as we speak haha!
 
i've noticed I use around 1-2gb less than any rival or higher tier amd card

Going back to this point, did the equivalent AMD card you refer to have the same or more VRAM? I only mention this because, in the case of the latter, isn't more of the VRAM used purely because it's available, rather than because it's needed?

I could be wrong but I'm sure I remember seeing this behaviour in some articles, perhaps when the 3090 was being compared to the 3080.

I'm pretty sure Nvidia cards do generally use slightly less VRAM than their AMD counterparts IIRC though, better compression methods perhaps? I'm leaving for a holiday tomorrow and hence am being too lazy to search for this info, feel free to indulge/ignore me! :p
 
Yeah, more VRAM across each of the AMD rival/higher-tier cards...

This is the interesting thing though: sure, the game allocates more VRAM 'in case' it's needed/because it's there... Fair enough... However, it was the actual usage when gaming I was referring to, so both cards are way under the allocated/required amount that the game's settings calculate... but mine by miles compared to what the AMD cards are using at the same res/settings! And the same goes for the calculated amount it reckons either card will use! Interesting eh.
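If anyone wants to log that same 'usage' figure themselves rather than eyeballing an overlay, here's a rough Python sketch using the pynvml NVML bindings (Nvidia cards only). Worth saying it reports VRAM currently allocated on the card, roughly the same number the overlays show, not some deeper 'truly needed' figure; the device index and poll interval below are just assumptions:

```python
# Rough VRAM logger via NVML (Nvidia only) - prints allocated VRAM every few seconds.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the gaming GPU is device 0
name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):                    # older pynvml versions return bytes
    name = name.decode()

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"{name}: {mem.used / 1024**3:.1f} GiB used of {mem.total / 1024**3:.1f} GiB")
        time.sleep(5)                          # poll every 5 seconds while the game runs
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```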
 