• Competitor rules

    Please remember that any mention of competitors, hinting at competitors, or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

7800XT is going to be a 5-year card!

As someone who usually favours AMD but has both brands' GPUs across his current two rigs, this isn't true. The 6950XT can easily reach 90C+ while having insane coil whine; literally look at the angry feedback from customers who bought them in retailers' review sections. The Sapphire 7900XT and XTX have been known to run hot, and both suffer coil whine even if you buy into the more expensive mid tier of the three price options. As for the 4070, mine never even turns its fan on, as it won't ever reach 65C; it undervolts like crazy, and even at stock the fan never came on with the dual BIOS switch set to performance mode, at native 1440p max settings in games... It's also worth noting that, side by side with my mate's 6800XT playing the same game at the same native res/settings, his 6800XT was using 11.5-12GB of its 16GB of VRAM while my 4070 was using 9.3-9.7GB, or quite often less...

I'll repeat my point: this was at native res, so no cheating with DLSS. TL;DR: both cards had the same actual VRAM remaining free at all times, despite my 4070 having 4GB less out of the box, so games definitely are optimised better for Nvidia. Whether there's sneaky compression doing this or whatever the case, if you can't see it and it doesn't affect actual gameplay/input response, who cares? Both cards will last the same time, but then with DLSS 3.x onwards and frame gen, the 4070 should last a bit longer; if not, I'd guarantee it will once it gets to the point those need to be enabled to maintain settings/resolution...

And FWIW, this test was done on pretty much every game out that we both had, bar BG3/Diablo 4 (as neither of us cares for either), with actual VRAM usage stats displayed, not what the game allocates, as that means nothing compared to actual use on the fly...

Oh, and you could get a 4070 for £520 last month from somewhere other than OcUK (rules state I can't name names), and that was an Asus one with a 3-year warranty, i.e. my exact card. So that's not much more than a 7800XT and cheaper than a 6800XT was when new; hardly a rip-off, hence why I thought I'd try it over the coil-whine/furnace-spec 6950XT that peaks 150W higher than a 6800XT... I only game at 1440p and like having a cool room and a silent system that, even including the monitor/speakers, uses less than a 6800XT/6950XT would on its own!

With the money I'll have saved on electricity in 2-3 years (playing 6 hours every night at a horrific kWh rate) by not adding the extra 320-350W TDP of a 6950XT to my system + monitor running costs, the 4070 will literally have cost me nothing by year 3, versus throwing the same money down the drain on a wattage-drinking current or last-gen AMD card, all while not having to suffer coil whine, sitting next to a furnace, or loud fans. FWIW my ENTIRE SYSTEM uses 260-300W at the wall, including the monitor/speakers! That's the same as just powering a 6950XT.

So I end up with a FREE GPU in 3 years AND £520 back from the money I'd have spent on the extra electricity of adding a 6800XT/6950XT/7900XT/XTX to my current setup... Then fit the 4070 to the second rig or sell it, and put the bonus electricity money towards a new GPU. Free GPU for 3 years. No-brainer.

So yes both teams have dropped the ball in their own ways.

I guess that's the difference between believing clickbait videos/graphs online and actually trying something yourself, then returning it no questions asked under the 14-day no-quibble policy everyone offers. After all, why would I let some muppet on YouTube delegate my money/opinion when I can send it back next-day delivery and only lose, what, £5-8 in return postage and an hour of my time trying out the GPU with my games and installing a driver?

All of that is at best subjective. This thing runs at over 70C, and while that doesn't seem too bad, bear in mind that its fans are running at over 1800 RPM; at over 36dB it's one of the loudest GPUs you can get these days, and it's £560.

Coil whine has nothing to do with the GPU brand; quite often it's actually caused by the PSU, not the GPU, and even when it is the card, it's the coils making that noise. All cards, be they AMD or Nvidia, can suffer coil whine; there are plenty of Nvidia examples that have it and plenty of AMD examples that don't.

This is the cheapest RX 7700XT. Like the £560 4070 I linked, it's got an inexpensive dual-fan cooler. It's £430, has 90% of the performance of the 4070, and overclocks to 18% more 'performance' according to TPU (which, OC vs OC, makes it as fast as if not faster than the 4070), at 61C and 26.3dB, a huge difference. It's a far higher-quality product despite being £130 cheaper. That 4070 is cheap junk for silly money and no one should be buying it; isn't that what tech journos like to say these days?
 
It's not subjective when we literally tested rigs side by side at the same settings. I then worked out that, at 6 hours daily for a year at my kWh rate, it would cost me £192 a year extra to run the 6950XT or 7900XT/XTX I was originally going to get. So not only did I save nearly £300 on my GPU, I also changed from a 5800X3D to a 5700X and undervolted both, came out £600 under budget, bought a new 31.5" monitor and a second mITX rig, and still had change. Then add the 3 years of having a free GPU, then put it in your second rig or sell it!
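For what it's worth, the £192/year figure roughly checks out as a back-of-envelope sketch under an assumed unit rate of about 25p/kWh; the rate isn't given in the post, 25p is simply the value that makes the quoted numbers add up:

```python
# Rough check of the ~£192/year claim above. The 350 W extra draw and
# 6 h/night come from the post; the 25p/kWh unit rate is an assumption.

def yearly_running_cost(extra_watts, hours_per_day, price_per_kwh):
    """Extra electricity cost per year of a card drawing `extra_watts` more."""
    return (extra_watts / 1000) * hours_per_day * 365 * price_per_kwh

cost = yearly_running_cost(extra_watts=350, hours_per_day=6, price_per_kwh=0.25)
print(f"~£{cost:.0f}/year extra")  # ~£192/year
```

At a higher unit rate the gap only widens, so the claim mainly hinges on the 6-hours-a-night assumption.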

It is a fact that I WILL have a free GPU for the next 3 years, plus £520 or more saved from my electricity bills towards whatever 2023 GPU is out, which a 320/380/450W 6950XT/7900XT/XTX won't give me back. That's fact versus speculation!

How can I lose, all while getting 6800XT or higher performance, with the future-proofing of frame gen/DLSS 3.x onwards and super-low-latency mode etc., versus buying a rebadged 3-year-old card and saving a measly £40? Pointless!

When mine slows down I'll just use DLSS/FG. Having tested both, and being an AMD guy, I was genuinely shocked that I couldn't tell DLSS was on, versus FSR, which is quite frankly laughable. And we all know how biased/influenced/sponsored game devs are towards Nvidia, so I know for a fact mine will be the better supported over the next 3 years.

As for coil whine, I think you are speculating.

I don't care that you're being biased by picking an overpriced 4070 with rubbish fans/cooling; you are clearly not listening. I said last month MY card, an Asus Dual 4070 in white, was £520 from a rival supplier. That is not a badly made card/cooler; it has a 3-year warranty and dual BIOS, and it never goes above 57-63C, with the stock fan set to engage at 65C. The ONLY game that's made the fan come on for a few seconds was Control with RT on at 1440p max settings, at 66C, and it spun down seconds later...

A 7700XT cannot do RT/DLSS/Frame Gen like this, so no, it isn't 90% of the feature set nor the performance. You are living in a dream world, and THAT is the rip-off card if we're talking about pricing.

A 6700XT is good value at £280-300, but a 7700XT for £90 less than a 4070 clearly isn't. Nor is a 7800XT with 3-year-old 6800XT performance for, what, £120 less than the 6800XT RRP? Pot, kettle, much!
 
Even now, a month later, my GPU is still cheaper than the one you linked (and, as I mentioned above, if you'd bought mine in white last month it was £520), and if you choose one of the many other brands it's still only £529.99, thus £40 less than a Sapphire 7800XT that isn't even in stock at the price listed...

As of right now... the MSI/Gigabyte/Zotac/Asus cards have 3-year warranties as well... and they aren't junk brands/coolers... So tell me why these are rubbish versus a 3-year-old card rebadged at a rip-off price that doesn't have RDNA4, nor the same RT ability, nor DLSS 3.5/frame generation/better optimisation in real-world game use.
[attached image: 4070-cheap.png]


I would have bought it regardless of whether it was AMD or Nvidia; as I've said, I'm usually more of an AMD guy by preference, but I couldn't argue with the performance/future-proofing for the next 2-3 years per watt and per pound, nor the fact I get my money back in 2-3 years from electricity savings. This is the difference between trying something 'different' and being biased. At worst I'd have returned it and run back to AMD; big deal, for a bit of return postage and time...

So do elaborate on how my card, with its future-proofed bonus features and actual VRAM usage per FPS/fidelity, is 'a rip-off/rubbish' when it runs on air, is silent, doesn't turn my room into a sauna, and effectively refunds itself in 2-3 years of electricity saved because it's an undervolting monster, all while I can still sell or keep the 4070 for extra £££! I think I'll be the one laughing!
 
Maybe if you had looked harder you would have found one for 50p. I'm not interested in here-one-hour, gone-the-next deals or smaller retailers trying to get noticed by selling below cost; what's good for one is good for the other. If I work hard enough I'm sure I can find anything 10 or 15% cheaper than you can find it almost anywhere else, and if I do, I'm not going to use it to imply the person I'm debating is disingenuous. That would be the same strawmanning as "but you can sell Starfield, so it's actually only £390". Not that I would need to, because I have seen them at £390 RRP and still including Starfield, so what's that now if you sell Starfield, £350? No thanks to circular arguments.

Power consumption is another strawman argument that only ever comes up when Nvidia is winning it. The 7700XT draws 30 watts more than the 4070; at 27p per kWh and 3 hours of gaming per day, every day, that's under £9 a year, so you would need well over a decade just to make back the cost difference.
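Taking those figures at face value, the payback arithmetic sketches out like this (the £130 price gap is the 7700XT-vs-4070 difference quoted earlier in the thread; all inputs are the post's numbers, not measurements):

```python
# Payback time for a GPU that costs more up front but draws less power.
# Inputs are the figures quoted in the thread, not measured values.

def annual_extra_cost(extra_watts, hours_per_day, price_per_kwh):
    """Extra electricity cost per year for `extra_watts` of additional draw."""
    return (extra_watts / 1000) * hours_per_day * 365 * price_per_kwh

extra = annual_extra_cost(extra_watts=30, hours_per_day=3, price_per_kwh=0.27)
print(f"extra cost: £{extra:.2f}/year")            # ~£8.87/year
print(f"years to recoup £130: {130 / extra:.1f}")  # ~14.7 years
```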

Better supported how? Where is Nvidia's big Starfield driver? It's been 3 weeks. Maybe they have AI making that driver?

On price to performance versus last gen, do you want me to detail the 4080, the 4070 Ti, the 4060 Ti 16GB (lol), the 4060, even the 4070?
 

The only two out of that lot I would buy are the Asus and the Gigabyte, for £460 and not a penny more.
 
That doesn't make any sense.
My card was £520 last month if you bought it in white, and even now, a month-plus later, it's still LESS than the £560 'rubbish fan' version you linked. So how is it expensive/a rip-off when it's £40 more than a 6800XT in a dress?

The £530 in-stock cards make it even cheaper, defeating the 'rip-off' label even more.

That's complete biased nonsense; the only reason I know about power consumption is that I've tracked it on my own AMD GPUs/CPUs. I don't get what Nvidia has to do with that; it works either way.

It's clearly valid if it saves me the entire price of my GPU over its ownership, thus FREE, and I can still sell it in the future. How is that a bad thing?

I have zero interest in last gen; I don't upgrade every year. I had an RX 580 8GB and originally wanted a 6600XT, then a 6700/6800XT last year, then considered a 3070 Ti but didn't want to pay the £875 for it, and didn't like how much the 6800XT/6900XT/6950XTs ate in electricity. I settled on a 7900XT this year, then found out they have coil whine, a high RMA rate early on and issues mentioned on many forums, even by owners of Sapphire and other good brands; they eat electricity, are like sitting next to a furnace, fall over doing RT, and cost £200-300 more at the time I had the money to buy my 4070... The final nail in the coffin was 6950XTs only offering A YEAR's warranty! That is scummy, and all the Sapphire 7900XTs that cost £60-£100 more than the reference card (scene tax much?) had a pathetic 2-year warranty for something 'top tier'... All while adding 250-350W more wall draw than the proven undervolt monster that is my 4070; thus 6 hours x 365 x my kWh rate = £192 a year MORE to power those GPUs... No thanks!
 
Yet you'd spend £20 more than that £460 on a 7800XT, a 3-year-old 6800XT in a dress with a 2-year warranty... and you're talking about value/feature set per pound? I have the Asus. It was £520 last month if you used the same supplier and bought it in white, and even now it's still cheaper than the rubbish-fan one you linked earlier. The Gigabyte/MSI both have 3-year warranties too, and owners' feedback has proven them reliable on that card.

The only reason I back this 4070 is that it's like a modern 6600XT, i.e. a hell of a lot of bang performance-wise for its wattage/physical size, and it fits in my mITX case when I'm done with it in the main rig. I'm an AMD guy by choice, but I wasn't scared to try something different.

I'd bet 99% of the hate for this card wouldn't exist if AMD had made it...
 

I wouldn't buy anything with a 2-year warranty, and I'm not going to. But I would sooner buy a 7800XT over the 4070 at the same money, or a 7700XT for less.

The 4070 is worth more to me than the 7700XT, but to use OcUK examples, not £80 more.
 
No, neither would I; that's what I'm trying to say. And at the time, I thought the sneaky one-minute pricing where the 7900XT was £739.99, then back to £839.99, then £900, back and forth, for a card with a 2-year warranty, was very VERY scummy regardless of who sold it; it was the same everywhere I looked!

I think that, for a second-from-the-top premium product like that at nearly a grand, that isn't acceptable!

I think £520 for a brand I trust (Asus), with a 3-year warranty handled on a no-questions-asked, just-send-it-back-and-we'll-deal-with-the-rest basis, paired with it costing me nothing over 3 years having saved £192 EACH year playing 6 hours a day every evening, is a no-brainer. I mean, it's literally free, and I can still sell it or put it in my second rig afterwards.

I think just having things like FG/DLSS 3.5/RT makes it worth it. I mean, I can literally run, say, CP2077 at 72fps max settings in a demanding area at native 1440p, or get the same WITH RT on psycho using DLSS 3.5/FG, or have 120-130fps without RT using FG/DLSS/super low latency. Nice future-proofing when required.

TBH, bar Starfield, I've had no reason to use DLSS, and I could have just settled for 58-65fps native 1440p max settings if I'd been bothered... But having 88-120fps depending on the area is a no-brainer when you can't tell it's on.
 
It's overpriced...
It's £40 more than the 7800XT, a 3-year-old 6800XT in a dress, with more features/future-proofing; it runs on air, it's silent, the fans never kick in, it undervolts like a beast, and it doesn't turn your room into a sauna.

It supports all the latest features and doesn't cripple the card when using them; what's not to like? Personally IDC what did what last year, as I didn't own last gen. Nor do I care about the naming structure, as that's neither here nor there; we can play that card by saying a 7900XT is just a 7800XT and a 7700XT is a 7600XT, if you wanna split hairs...
 
Because you probably don't know.

My GPU history over the last 8 or 9 years:

GTX 970
GTX 1070
RTX 2070S

It's time to upgrade. I've had the 2070S the longest because I just couldn't bring myself to buy the 3070; I thought it was crap for the money, and I wasn't too keen on AMD's offerings either.

But Nvidia have done it again. AMD: not too bad, not great, but OK.

I don't care about DLSS; despite people telling me "it's better than native", I can't help but see for myself that it clearly isn't. As for RT in this range, 25 FPS on an AMD card or 35 FPS on Nvidia is just #### or a bit less ####, but still ####!
30 watts more power... meh, so what?
I much prefer AMD's drivers: better UI and better features.

So with prices as they currently are, unless things change drastically soon, my love affair with Nvidia is going to end. I'm not paying extra for something I don't use, or for slightly less #### RT and a driver UI from Windows 98 SE (yes, I'm that old).
 
OK, I didn't know that; I can see why you want to squeeze the bang per buck having had a 2070S. DLSS-wise, for someone like me who has gone from an RX 580 8GB to this 4070 and wasn't a believer in DLSS, having tried awful FSR 1/2, I was genuinely impressed that I couldn't tell it was on with 3.5, and the added FG future-proofing on top makes it a no-brainer; being able to enjoy RT without giving the card a stroke (in my situation) IS worth it. Yes, it's nice to have a backup feature like DLSS, but to have one where you can literally challenge your mates to say which is native and which isn't, coming from a lifelong AMD guy like me, was very impressive (bar Nvidia being the built-in GPU in, say, my MacBook Pros).

If you look at the fact I can hit the 70s in intense areas, or the 80s in lighter areas, in CP2077 natively at 1440p max settings, I think that's decent for this card; the fact I can gain another 30-50fps with psycho RT on via DLSS 3.5/FG is, from a future-proofing point of view, a valid reason for me to own it.

As for wattage: 30 watts? You're kidding me. A 4070 does 200W max even at native 4K or 4K with DLSS/RT etc. Mine undervolts to 105-125W in every game, even when overclocked (or less at stock and just undervolted), all measured at 1440p max settings native, or 134-145W with RT on at native 1440p in Control or Resi 4 Remake without DLSS... FWIW it mostly lives its life at 105-115W, which is a hell of a difference versus a 320-360W+ AMD card from the last or current generation...

For me, I wanted brand new, a 3-year warranty, modern feature sets/future-proofing, and this gen's features, so whether it was only X faster than last gen wouldn't bother me as much if it meant I got the future-proofing headroom the previous gen didn't have. Remember, I was going to get either a 6800XT or a 3070 Ti last year; glad I didn't!

Put it this way: my entire setup draws 260-300W at the wall, and that's with a 31.5" 165Hz 1440p monitor and an amp with speakers... Now, simply put, a 6950XT/7900XT uses more than that on its own. Thus, if you add another 250W on top of my 105-115W average undervolted 4070, that would mean £192 MORE wasted on the electricity bill over the course of a year, EVERY year I own it, at the kWh rate I pay... That's disgusting, and I like to play 6 hours or more of an evening, as I only built it for gaming.

So in my situation, bang per buck, this setup cannot be beaten. I even purposely chose the 5700X over an X3D for the same reasons, and I've gained by going from a 27" 1080p 144Hz monitor to native 1440p 165Hz gaming at literally half the electricity cost. I really did not like the idea of my purchase costing me double every year I use it; after all, this stuff loses nearly half its value every year, as we know. So to get my GPU basically refunded by year 3 and either sell it for extra £££ towards a 2026 GPU or put it in my mITX second rig (which I know it'll fit) was a no-brainer in my situation.

However the rival cards were not.

I'm glad we both now know each others situations/decisions.
 
@Removed User 345456 as an "AMD guy" you should know how AMD pushes their cards way beyond the sweet spot on the efficiency curve, and how easy it is to remedy. We can talk all day about how silly that is, and I would be inclined to agree, but nevertheless it's just the fact of the matter. My 6950XT Red Devil has horrible coil whine on multiple PSUs (Gold/Platinum) at stock and sucks a "healthy" 330+ watts at the wall. What's funny is that if you downclock the GPU by 10% you don't lose 10% performance, more like 2-3%, the power draw drops to around 250-270 watts, basically all the coil whine is gone, and temps are MUCH improved. The reason for the 6000-series, and now 7000-series, power draw and heat output (we saw the same with Vega, btw) is that AMD is being AMD. It's still a tinkerer's brand in the upper-mid-range to high-end bracket. If you know what you're dealing with you can get incredible performance/watt and performance/dollar, but if you don't care or don't know, you're leaving a lot on the table and the end result is a bit lacklustre at times.

I'm currently running my 6950XT downclocked by 18% (no UV, as mine is a dud in that regard): power draw is down 39%, performance loss is roughly 10%, and junction/hotspot temperature is down 30%. While I haven't measured fan speed, it's also down by a lot; the card makes almost no sound anymore. The pump in my Arctic Freezer 240 is audible now, annoying me to no end :p.
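Those quoted figures imply a sizeable efficiency win, which a one-liner makes concrete (both percentages are taken straight from the post, not measured independently):

```python
# Perf-per-watt change implied by the figures above: ~10% performance lost
# for a ~39% power reduction, with stock normalised to 1.0.
perf_loss = 0.10   # performance lost from the 18% downclock
power_drop = 0.39  # reduction in power draw

perf_per_watt = (1 - perf_loss) / (1 - power_drop)
print(f"perf/watt vs stock: {perf_per_watt:.2f}x")  # ~1.48x
```

In other words, on these numbers the downclocked card does roughly 48% more work per watt than stock.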
 
Yeah man, it's whatever works for the individual and their needs; I just personally hate wasting money for the sake of it. It's bad enough with cars (I play with them and have them running on 99 octane at smiles per gallon rather than miles, haha), so having a PC that does the same while rapidly losing its value every month is pointless to me.

I think losing 18% of your clocks and 10% of your performance is pretty nuts if you're buying a card that serious; at those settings you may as well own whichever card matches that bang per buck. When owners of the 6950/7900XT told me they were using 135W or more when undervolted at 1080p I was pretty shocked, as a stock 4070 uses around 105-115W at 1080p ultra in TLOU, for example. And let's face it, 1080p for a 6950/7900XT/XTX isn't exactly pushing it, is it?

Also, as games get newer you'll find you eventually push that card right back up to its stock power limits to achieve what you need, whereas mine, for example, will never go above its 200W, and it won't get near that anyway because it undervolts so well. Not a dig; it was just a major factor for me.

Yeah, the heat/coil whine/temps on a stock 6950XT are nuts; at peak they're up to 150W worse on wattage than a 6900XT if you're really pushing them at stock.

I enjoyed getting my RX 580 8GB down to 76-96W, but it required -200MHz on the game clock and -50 on the power limit, sigh. My Asus 4070, out of the box, has something crazy like 350MHz more game clock than the reference Nvidia card and happily does 3-3.1GHz (watch OcUK's own 8Pack review and clock it), with a 500MHz OC on the memory at those clocks. I personally run it, IIRC, 80MHz higher than the reference Nvidia clocks but at a ridiculous undervolt, and IF I want to I can clock the memory up to 1GHz more, which I found stable over 6-8 hours playing RDR2 and other games; I just don't feel it needs it ATM, so I run the 'stock'-style game clock with stock memory speeds, undervolted to the gills. I find it mad that I can play 90% of games at native 1440p ultra settings at 105-115W, occasionally pushing 125-135W, all in glorious silence, with a basic cheap case and three £9.80-a-pop BeQuiet fans running at 40% or less IIRC, while the rig sits next to my leg and isn't a furnace. Oh, and for extra scene points, this is next to a bleedin' airing cupboard, so the build is a testament to temps, bang per buck, and silence.
 
Yeah man, yeah, it's whatever works for the individual and they're needs, I just personally hate wasting money for the sake of it, it's bad enough with cars I play with them and have they run off 99octane at smiles per gallon vs miles haha, so having a pc also do that whilst rapidly loosing it's value per month is pointless to me.

Well, price certainly comes into the equation. The current 6950XT prices make it a non-purchase IMHO, and I'm very happy with my purchase; I did not, however, pay the current price for a 6950XT. I got it for the price of the cheapest and poorest 4070 available in my country (DK). To me, playing at UW 1440p, memory and bandwidth are important, and on that alone the 4070 just falls flat on its face. By the time my current settings aren't enough any more, full stock settings won't be either, as the performance loss has been so small. Clocks don't matter; it's the end result that matters, and whether it's 60 fps or 55 is the same to me, a non-starter :). Currently playing CP2077 at near max settings; the benchmark gives me 118 with FSR (rig in sig) on a 5800X3D with SMT off, so just 8 pure cores, power-locked, on Linux. I think I'm well off for the foreseeable future. But you are correct in "different needs and priorities for different people".
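For what it's worth, the "60 fps vs 55 fps is the same to me" point is easier to see in frame-time terms; a quick sketch of the conversion:

```python
# Frame-time view of the 60 fps vs 55 fps comparison in the post above.

def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame is on screen at a given frame rate."""
    return 1000.0 / fps

print(f"60 fps -> {frame_time_ms(60):.1f} ms/frame")  # 16.7 ms
print(f"55 fps -> {frame_time_ms(55):.1f} ms/frame")  # 18.2 ms
print(f"delta  -> {frame_time_ms(55) - frame_time_ms(60):.1f} ms")  # 1.5 ms
```

A 1.5 ms per-frame difference is well below what most people can perceive, which supports calling the two rates interchangeable.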

I'm also a bit of a car enthusiast myself (a novice in knowledge), but it's difficult here in Denmark, as the general public hates cars (anything with a combustion engine) and our politicians make laws almost every year that make it ever harder to nurture that hobby outside of a race track (and even on it, at times).
 
I've recently watched a YouTube opinion piece on how the 7800XT is such a good card that it will last 5 years.

I totally disagree.
Let's disregard the marketing BS ("7800 XT" vs "7800"): if the 6800XT, a THREE-year-old card, costs a little less than the 7800XT and trades blows with it to the point where there's doubt over whether the 7800XT is the better choice, then the 7800XT is a very dubious release. Unless drivers and a general software shift give the 7800XT legs, it's a failed release.

The 7800XT is going to be your 5-year card because AMD has effectively given up on RDNA 4. RDNA 4 has development challenges and will be released with features disabled, in lower-end SKUs.

AMD is focusing on RDNA-5.
They've decided to allocate TSMC wafers to their much more profitable FPGA and GPGPU products.
Nvidia is de facto taking the same approach.

In an indirect way, I am OK with this.
Hopefully, if GPU power isn't going to increase, developers will focus on optimising their releases and on gameplay rather than on "reflective nose hair for NPCs", "volumetric farts" and other graphical tackiness, which is already over the line if you ask me.
A 5-year card? It's terrible at RT, so not even good enough to consider now, IMO (I appreciate some don't care about RT, though, just as some don't care about anything above 1080p; each to his own).
 
Fair enough mate :)
I think getting 110-130 with DLSS from just a 5700X set in eco mode and a 4070 at 1440p ultra isn't bad, is it? Or RT on Psycho at 70-90 depending on the area. It'd be interesting to see what a jump I'd get with an X3D, but I purposely chose the 5700X to run in its 'eco' mode versus the higher TDP (105 W, IIRC) of an X3D, which also runs VERY toasty. I wanted a silent, cool rig, since it lives next to an airing cupboard as it is, and I'm not one for watercooling or its additional costs, haha!

Ah, cool man, I love cars myself: older stuff with soul, old-school motorsport and drivers. It's a shame the hobby is dying in some countries.
 