5090 Price

Will there even be differences in performance between the 5090 Strix and the FE...?
I've had both FE and Strix. In summary, if you're not overclocking massively and not a huge stickler on temps and noise, there is virtually no difference for 99% of users. The Strix, however, runs discernibly cooler (due to its gargantuan hunk-of-metal cooler) and has higher OC potential (better VRMs and likely binning) -- mine did, at least. The Strix is also easier to shunt mod, if that's your thing.
 
Where is this nonsense coming from?

So 16GB isn't enough to last two years but only 4GB more would be fine?

16GB will be more than adequate for a good few years. The 3080 10GB from four years ago still runs 90% of things without any trouble at all.
Will it though? Doesn't Indiana Jones require more than 16GB to run with everything maxed? For a brand new 5080 I would expect it to destroy current titles fully maxed, with ease, surely? Isn't that the point of high end gaming?
 
Will it though? Doesn't Indiana Jones require more than 16GB to run with everything maxed? For a brand new 5080 I would expect it to destroy current titles fully maxed, with ease, surely? Isn't that the point of high end gaming?
Nothing is "supposed" to run a certain way. When developers push graphics hard enough, it takes a few hardware generations before a particular game runs "easily".

As it's always been (Crysis, Witcher 2 and so on).
 
Where is this nonsense coming from?

So 16GB isn't enough to last two years but only 4GB more would be fine?

16GB will be more than adequate for a good few years. The 3080 10GB from four years ago still runs 90% of things without any trouble at all.
If you want to turn settings down, particularly textures, then fine. For the insane price they'll be charging for it, 16GB is not only not enough, it's unacceptable for a GPU that people may have for the next 3 to 5 years before upgrading.

Games are getting more VRAM demanding. The new Indiana Jones game already recommends 12GB at 4K. Last of Us Part 1 uses over 14GB at ultra 4K, Alan Wake 2 uses a lot as well, and so on. And that's games that are already released, while the new GPU isn't even out yet. I'm sorry, but 16GB is just not enough to last the next 3 to 5 years in a lot of AAA games without compromising settings.
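If you want numbers rather than arguments, it's easy enough to watch actual VRAM allocation while you play. A rough Python sketch (assuming the nvidia-ml-py / pynvml package; note that NVML reports memory allocated on the card, which isn't always the same as what a game strictly needs to avoid stutter):

# Rough VRAM monitor - leave it running in a terminal while the game is up.
# Assumes nvidia-ml-py (pynvml): pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gib = mem.used / 1024**3
        total_gib = mem.total / 1024**3
        print(f"VRAM: {used_gib:.1f} / {total_gib:.1f} GiB")
        time.sleep(2)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()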
 
I've had both FE and Strix. In summary, if you're not overclocking massively and not a huge stickler on temps and noise, there is virtually no difference for 99% of users. The Strix, however, runs discernibly cooler (due to its gargantuan hunk-of-metal cooler) and has higher OC potential (better VRMs and likely binning) -- mine did, at least. The Strix is also easier to shunt mod, if that's your thing.
Ok, but apart from overclocking it to run some benchmarks, I can’t imagine finding myself in situations where I would feel the need to overclock a 5090 :eek:
 
Will it though? Doesn't Indiana Jones require more than 16GB to run with everything maxed? For a brand new 5080 I would expect it to destroy current titles fully maxed, with ease, surely? Isn't that the point of high end gaming?

That's the job of the 5090. If the 5080 "destroyed" everything on max settings, why would anyone buy a 5090?
 
That's the job of the 5090. If the 5080 "destroyed" everything on max settings, why would anyone buy a 5090?
But are we saying it's now acceptable to have to turn down settings on a brand new, just-released high-end GPU? I can't get my head around that.

The 5090 is more future-proof and should run things at silly high frame rates, sure, but without it you have to turn down settings now? Wtf.
 
Games are getting more VRAM demanding. The new Indiana Jones game already recommends 12GB at 4K. Last of Us Part 1 uses over 14GB at ultra 4K, Alan Wake 2 uses a lot as well, and so on. And that's games that are already released, while the new GPU isn't even out yet. I'm sorry, but 16GB is just not enough to last the next 3 to 5 years in a lot of AAA games without compromising settings.

Games' VRAM requirements are in line with the PS5 (realistically the Pro), so ~16GB plus some extra for better textures, more complex BVH structures and typically higher native resolution. I can see stuff bumping up against this soon enough - but I doubt it'll go massively over until the PS6 is in the market and hits its stride - 2028-2029 sort of time.
 
I'd expect the 4080 won't drop much in value? Probably still get the £800 I paid for it, because AMD has nothing really and Nvidia have just added the 50 series on top of the 40 series.
 
I've had both FE and Strix. In summary, if you're not overclocking massively and not a huge stickler on temps and noise, there is virtually no difference for 99% of users. The Strix, however, runs discernibly cooler (due to its gargantuan hunk-of-metal cooler) and has higher OC potential (better VRMs and likely binning) -- mine did, at least. The Strix is also easier to shunt mod, if that's your thing.

I agree that better cooling can help with clock speeds, but it's been a very long time since overclocking an Nvidia card was even remotely possible beyond serious hardware modifications, no? The boost algorithm is what it is, and I'm not sure there's such a thing as binning or improved component quality in terms of getting a noticeably better OC.

These days it all seems to be about undervolting to open up performance within the boost algorithm's envelope. I can't see any value in these so-called "Ultra Premium" models which are significantly more expensive for no apparent reason.

I'll just be unlinking Temperature and Power Target and putting the slider all the way to the right *shrug*
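For what it's worth, once the sliders are maxed it's easy to sanity-check what the card is actually drawing against the enforced limit rather than guessing from the OSD. A quick sketch, again assuming the nvidia-ml-py / pynvml package:

# One-shot check of power draw vs. the enforced power limit, plus core temp.
# Assumes nvidia-ml-py (pynvml): pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0

draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0            # NVML reports milliwatts
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0   # cap currently being enforced
temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)

print(f"Drawing {draw_w:.0f} W of a {limit_w:.0f} W limit, core at {temp_c} C")
pynvml.nvmlShutdown()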
 
Torn between getting a FE straight away or waiting for AIBs with better cooling. By all accounts it's going to be a beast on power consumption, and I game either on speakers or open-back headphones so don't fancy a jet engine cooler on it. But waiting for the AIBs is both more costly and a longer wait…

My 3080 isn't happy anymore since I upgraded to 4K; it's just completely run out of grunt at settings levels I'm OK with, even with DLSS. 100 FPS is OK in some games but feels sluggish in others…

Decisions, decisions
 
Torn between getting a FE straight away or waiting for AIBs with better cooling. By all accounts it's going to be a beast on power consumption, and I game either on speakers or open-back headphones so don't fancy a jet engine cooler on it. But waiting for the AIBs is both more costly and a longer wait…

My 3080 isn't happy anymore since I upgraded to 4K; it's just completely run out of grunt at settings levels I'm OK with, even with DLSS. 100 FPS is OK in some games but feels sluggish in others…

Decisions, decisions
The 4090 FE cooler was really good; go for it IMO. Under intense load (in my well-ventilated case) it's maybe not silent, but that's only a small percentage of the time; the majority of the time it actually is.
 
Torn between getting a FE straight away or waiting for AIBs with better cooling. By all accounts it's going to be a beast on power consumption, and I game either on speakers or open-back headphones so don't fancy a jet engine cooler on it. But waiting for the AIBs is both more costly and a longer wait…

My 3080 isn't happy anymore since I upgraded to 4K; it's just completely run out of grunt at settings levels I'm OK with, even with DLSS. 100 FPS is OK in some games but feels sluggish in others…

Decisions, decisions
I always found the FE cooling to be pretty decent on both my 3080FE and 4080FE, never had any issues.
 
I'd expect the 4080 won't drop much in value? Probably still get the £800 I paid for it, because AMD has nothing really and Nvidia have just added the 50 series on top of the 40 series.
I suspect it will get pulled down some, as you have lower-priced cards nipping at its heels, e.g. the 9070 XT (and possibly the 5070), and a faster card that's close in price, e.g. the 5070 Ti. I doubt used 4080s will be selling for £800 by the end of Jan.
 