NVIDIA 4000 Series

Both the 4080 and 7900 XTX should have been priced similarly to what the 3080 and 6800 XT were last gen. Not sure why, when Nvidia raises their price by £550, AMD can raise theirs by £400 and get praised because they're £150 cheaper.

Really? It's because it's a better card and much cheaper. Not sure what you're missing.

No-one is praising anyone for price hikes gen on gen; that's a straw man.

I think what you don't like is that while both have raised prices beyond what they should have been, NV's prices have increased more than AMD's and their cards are worse. There's clear separation between the two here, so simply saying "they're both bad" isn't a very convincing argument when NV is objectively and factually worse this gen than AMD.

You can get a faster, cheaper card all the way down the stack from AMD, oh and of course in many cases more VRAM.

:)
 
Does it bother you that much that people don't approve of AMD's actions, or that many think their lower-tier cards are poorly priced? Again, just because Nvidia's cost more, it doesn't make AMD a shining beacon that we should all praise.
You should be happy, as there's no favouritism at play here. People are saying they're both bad. I've owned AMD cards before, and will again, but the effects of the AMD Kool-Aid wore off on me a few years after Barton. They aren't a plucky underdog, just a giant company. :D
 
Yep, they ain't your friend. Zero interest in their current offerings. They got way too greedy imo. Just because Nvidia do it does not make it acceptable.
 
No-one said AMD should be praised, so that's another straw man.

I don't approve of AMD's actions either, but the facts remain: their cards are faster and cheaper, and therefore NV are worse this gen. People are trying to make out there's no difference between the two because "both bad", but that's simply not true.

Faster cards for less money with more VRAM. If the cards were all neck and neck and priced equally, then they'd both be as bad as each other - but they're not.

;)
 
Don't use the Nvidia adapter or a CableMod cable, only the PSU maker's own direct cable. Job jobbed :cool:

Oh also, 80% power limit: you only lose about 4 fps from the total average, but shave about 100 watts of power consumption. 70% would probably be even better, but I've only gone to 80 tbh as that's good enough.
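
For anyone who'd rather script that than drag a slider in Afterburner, here's a rough sketch of the same ~80% cap using NVML via the pynvml Python package. Everything here is illustrative rather than what the poster actually did: GPU index 0, the 0.80 factor, and the assumption that pynvml is installed and you're running with admin/root rights. On a 450 W card, 80% lands around 360 W, which lines up with the ~100 W saving mentioned above.

```python
# Illustrative sketch: apply an ~80% power limit through NVML (pynvml).
# Assumptions: pynvml installed, run as admin/root, target GPU is index 0.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Stock (100%) board power limit, reported by NVML in milliwatts.
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)

# Clamp the requested 80% target to the range the board actually allows.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = max(min_mw, min(int(default_mw * 0.80), max_mw))

pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"Power limit: {target_mw / 1000:.0f} W (stock {default_mw / 1000:.0f} W)")

pynvml.nvmlShutdown()
```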
 
Gigabyte told me in an email that PSU manufacturer 12VHPWR cables could void my warranty.

The *only* cable that is warranty-safe is the ugly adapter that came in the box.

This is very telling too. The adapter has two different kinds of connectors on it, and no manufacturer seems to care what PSU cable you use to connect to the old-school PCIe connectors on the adapter.

The 12VHPWR connector requires extra-special-super-careful-legal-speak attention.

The 8-pin PCIe side of the adapter? Just connect whatever.

This is the response I got when I asked about using my Super Flower cable, made by Super Flower, for my Super Flower PSU:

"Dear Customer,

If this cable for some reason causes a burnout on the connector, we would have to inspect what the cause is before deciding if we're voiding or continuing to allow warranty service.

Best Regards,"
 
Just Gigabyte being Gigabyte. It's the Nvidia adapters (or CableMod ones) that are pictured melting; I've yet to see a PSU maker's direct cable (2x PCIe to 12VHPWR) melt. The Nvidia one that comes in the box is so bulky, with up to 4 inlets feeding the thin GPU plug, that it looked ridiculous to me from the first time I saw it in online pics when the 4090 was being unboxed in videos. Even if the number of cards that end up affected is small, it's still a greater risk than I am willing to take, and the Corsair 600W 12VHPWR direct cable they sell is just peace of mind in my eyes.
 
What a joke honestly. Roll on next gen. They will have it sorted by then.
 
They have it sorted now: Intel revised the 12VHPWR spec and recessed the sense pins. Lots of people are blaming Nvidia for this, but it was Intel that came up with the specification for the actual connector.

If you have just bought a new 40 series, then the chances are it uses the revised connector spec.
 
I have an Asus Thor 850P, and the cable Asus use is made by CableMod. I've ordered a cable direct from CableMod that uses the new design, grounding the sense pins in the cable itself. The only issue CableMod are seeing is with the adapters, not the cables, and should I have a problem, I'm encouraged by the support CableMod have given.
 
Apparently, the 4070 is getting a much needed price cut.

I might bite and look around for a used one in a couple of months as a decent stopgap, since I'd effectively be getting an incredibly power-efficient, single-8-pin 3080 12GB with DLSS 3 and better RT, which I find really appealing at a reasonable price.

Should be enough to drive UW and I'll happily drop down some settings along the way waiting for the 5000 series or AMD's 8000.

I mean, I can grab a 4080 soonish, but do I really want to? That is the question :P
 
I agree that the adapter is not as good as some other options. However, using it makes whatever issues I have Gigabyte's problem for 4 years from the date of purchase. I can tolerate the ugly adapter for 4 years of warranty.

For anyone looking to buy a 4090, I would get the Founders Edition (GN got a public commitment from Nvidia that they would honor the warranty regardless of the cable used) or email the manufacturers you are interested in and ask them up front what their policy is.

Or just throw the warranty out the window, use whatever cable you like, and YOLO it.
 
Don't do it.gif

First option makes A LOT more sense imo.
 
The only 4070 I would look at would be the Ti model; the non-Ti is awful. Doesn't it perform worse than a 3080 in some games?

If the 4080 remains at £1k+, save up a bit longer and go 4090.

And if you can do that, make sure to save a bit more and get the Strix model. Oh, and don't forget to save some more for the cable :p

How bad is the 4070, really? If it's under £500, I think it would do the job until next gen is out.
 