• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

RTX 4090 - actually future proof?

People who buy halo products are rarely the most informed buyers, which is why they ask whether a card is future-proof, or go all-out and claim it will last 10 years.

The last halo card I owned was the GTX 480. It lasted 4 years before settings had to be turned down to truly get 60 FPS at 1080p, and then it died.
Battlefield 3 came out in 2011, and even that game the card could not max while sticking to 60 FPS. The 480 came out in 2010.

 
The HD5800 series was a huge jump over the GTX200 and HD4000 series as well. Even the subsequent gen GTX460 1GB and HD6850 were still around 10% slower than the HD5850.
The GTX 470 murdered the 5850, which was its direct competitor, and both the 470 and 480 were a major force to be reckoned with in proper DX11 titles, the Crysis 2 debacle aside.

Both also improved heavily with overclocking: a 480 at 850 MHz could beat a stock GTX 580 at 775 MHz. This was when the shader clock was tied to the core clock, known as hot clocking, and Fermi had amazing IPC.
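The hot-clock arithmetic is easy to sanity-check. A minimal sketch, assuming the figures from memory of the public spec sheets (480 CUDA cores on the GTX 480, 512 on the GTX 580, and Fermi's shader domain running at 2x the core clock):

```python
# Rough theoretical shader-throughput comparison for Fermi's "hot clock".
# Core counts and clocks are assumptions from memory of the spec sheets,
# not measured results.

def shader_throughput(cuda_cores, core_mhz):
    """Relative shader throughput: cores x hot clock (2x core clock)."""
    return cuda_cores * core_mhz * 2

gtx480_oc = shader_throughput(480, 850)     # overclocked GTX 480
gtx580_stock = shader_throughput(512, 775)  # near-stock GTX 580

print(gtx480_oc, gtx580_stock)
print(f"OC 480 advantage: {gtx480_oc / gtx580_stock - 1:.1%}")
```

On those assumptions the overclocked 480 comes out a few percent ahead on raw shader throughput, which lines up with the claim that an 850 MHz 480 could edge out a stock 580.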

3545_26.png



3545_28.png


3545_34.png


460 almost matches the 5870 overclocked.

3545_50.png
 
I am looking forward to the 7900 XT that may come out. The reason being: the 4090 still runs PCIe Gen 4, but apparently AMD is pushing the use of Gen 5, which will change the game.

Running at PCIe 3.0 x16 vs 4.0 x16 barely makes a difference, even with a 4090.

PCIe 5.0 is really for storage at the moment, and it will be a while before it's relevant for graphics cards...
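For anyone who wants to check the numbers, the theoretical x16 link bandwidth per generation is simple to compute. A small sketch; the per-lane transfer rates and the 128b/130b encoding are the published PCI-SIG figures, but verify before relying on them:

```python
# Theoretical PCIe x16 bandwidth per generation.
# PCIe 3.0 onward uses 128b/130b encoding, so ~1.5% of the raw rate is overhead.

def x16_bandwidth_gb_s(gt_per_s):
    """Usable GB/s across 16 lanes for a given per-lane transfer rate (GT/s)."""
    return gt_per_s * (128 / 130) / 8 * 16  # bits -> bytes, times 16 lanes

for gen, rate in {"3.0": 8, "4.0": 16, "5.0": 32}.items():
    print(f"PCIe {gen} x16: ~{x16_bandwidth_gb_s(rate):.1f} GB/s")
```

That works out to roughly 15.8, 31.5, and 63 GB/s respectively. Since a 4090 loses only a few percent even on a Gen 3 link, Gen 5's 63 GB/s is academic for today's graphics cards.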

 
Which person was claiming Gen 5 will be a game changer? haha. Despite doubling its performance over Ampere, Lovelace barely eats into PCIe 4 bandwidth, and you can easily run an RTX 4090 at near full performance with just PCIe 3.0 x16.

PCIe 5, like 4, was largely designed to accommodate new SSDs, not graphics cards. Adding higher PCIe generation compatibility to a graphics card increases manufacturing cost, and who wants to pay for additional PCB layers and trace insulation when it doesn't add performance?

And good luck finding working Gen 5 riser cables, or spending a pretty penny on one only to get no extra performance from it anyway.
 
People who buy halo products are rarely the most informed buyers, which is why they ask whether a card is future-proof, or go all-out and claim it will last 10 years.
And you consider yourself among the smartest as you haven't bought a 4090? :rolleyes:
Funny, I've not seen one comment saying a 4090 will last 10 years :rolleyes:
 
And you consider yourself among the smartest as you haven't bought a 4090? :rolleyes:
Funny, I've not seen one comment saying a 4090 will last 10 years :rolleyes:
maybe even 6, 7, 8 years

It may last a decade

One is the OP saying 8 years, then a few posts down someone suggests 10 (a decade). 8 years is close enough that it's a bit petty to argue over 8 vs 10; for all intents and purposes it's the same thing.
 
The studio is known for building games with bad graphics that run terribly.

Optimisation is a wonderful thing: it's much easier to make a game that runs at 10 fps on a given GPU than it is to get 100 fps. Take, for example, the recently announced Resident Evil 4 remake, which has significantly better graphics than what was shown for the Silent Hill 2 remake, and yet its system requirements are far more lenient and target higher framerates.
 
One is the OP saying 8 years, then a few posts down someone suggests 10 (a decade). 8 years is close enough that it's a bit petty to argue over 8 vs 10; for all intents and purposes it's the same thing.
OK, point taken. I hadn't seen those comments, and I think they are wrong; a 4090 won't be "good" in 10 years, not anywhere near that, IMO.

stil, know kneed to cawl me fick cos i bort a 4090 init :D
 
And you consider yourself among the smartest as you haven't bought a 4090? :rolleyes:
Funny, i've not seen 1 comment that a 4090 will last 10 years :rolleyes:
I never said much about myself; if your brain was switched on, you'd notice I owned a halo card myself. It's called wisdom, not intelligence.
People who perpetually buy halo cards are amongst some of the dumbest, though not all; some use them for professional work.
The folk buying the card for, say, 1440p and wanting it to last don't realise that by the time the 4090 has aged enough to start slowing down at 4K while still performing very well at 1440p, a new card will be out, cheaper, with the same or better performance.

And it's not just halo cards: being on the fringe, in the very shallow minority, sets you apart but also sets you up for some serious cash loss. A rich person's world.
No one on here with a 4K screen is happy with their performance; they all want the next best thing. But this is not told to people with lesser budgets, so some will just go out and buy the halo card and then ask people for reassurance afterwards so they don't feel bad after dropping 2K.
 
The GTX 470 murdered the 5850, which was its direct competitor, and both the 470 and 480 were a major force to be reckoned with in proper DX11 titles, the Crysis 2 debacle aside.

Both also improved heavily with overclocking: a 480 at 850 MHz could beat a stock GTX 580 at 775 MHz. This was when the shader clock was tied to the core clock, known as hot clocking, and Fermi had amazing IPC.

3545_26.png



3545_28.png


3545_34.png


460 almost matches the 5870 overclocked.

3545_50.png
I was just talking generally about the generational performance jump at the time, not focusing on same-gen vs same-gen comparisons. I didn't mention the GTX470 and GTX480 because they were in a higher price bracket, and the architecture wasn't really optimised until the GTX500 series (which was pretty much a relaunch of Fermi, but with the power consumption and temperature sorted out).

Also, I don't know if you remember, but Nvidia's GTX470 and GTX480 came out some 6-8 months after the release of the HD5850/HD5870, meaning the flagship GTX285 was the only Nvidia card available at the time to compete against the HD5850/HD5870, which were the first DX11 cards on the market. Being late to the party and extremely hot-running aside, their launch prices were certainly not those of a direct competitor to the HD5800 series: the HD5850 was selling at £250~£280 and the HD5870 at £310~£330, while Nvidia launched the GTX470 at £370 and the GTX480 at £480~£500.

Realistically speaking, in terms of price and performance it was only the much later GTX460 1GB and AMD's own HD6850/HD6870 that could compete with the HD5850.
 
I never said much about myself; if your brain was switched on, you'd notice I owned a halo card myself. It's called wisdom, not intelligence.
you didn't have to explicitly say it, u implied people who buy 4090s are 'fick :D, and i assume u never bought one, so that means u aren't fick, correct?
what a sweeping generalisation about 4090 owners :rolleyes:
 
you didn't have to explicitly say it, u implied people who buy 4090s are 'fick :D, and i assume u never bought one, so that means u aren't fick, correct?
what a sweeping generalisation about 4090 owners :rolleyes:

4090 Fick Edition! Thicc is more like it with those Titan-ic proportions. :D
 
I was just talking generally about the generational performance jump at the time, not focusing on same-gen vs same-gen comparisons. I didn't mention the GTX470 and GTX480 because they were in a higher price bracket, and the architecture wasn't really optimised until the GTX500 series (which was pretty much a relaunch of Fermi, but with the power consumption and temperature sorted out).

Also, I don't know if you remember, but Nvidia's GTX470 and GTX480 came out some 6-8 months after the release of the HD5850/HD5870, meaning the flagship GTX285 was the only Nvidia card available at the time to compete against the HD5850/HD5870, which were the first DX11 cards on the market. Being late to the party and extremely hot-running aside, their launch prices were certainly not those of a direct competitor to the HD5800 series: the HD5850 was selling at £250~£280 and the HD5870 at £310~£330, while Nvidia launched the GTX470 at £370 and the GTX480 at £480~£500.

Realistically speaking, in terms of price and performance it was only the much later GTX460 1GB and AMD's own HD6850/HD6870 that could compete with the HD5850.
In December 2010 a competitor with a blue logo, sharing a similar name with Scania the truck manufacturer, was selling 480s at 250 quid. That was the only reason I got one. It taught me a lesson about being at the very top of GPU performance, though: you won't ever be satisfied.

You are right, though, they did come out later, and both were first attempts at DX11 too.
 
In December 2010 a competitor with a blue logo, sharing a similar name with Scania the truck manufacturer, was selling 480s at 250 quid. That was the only reason I got one. It taught me a lesson about being at the very top of GPU performance, though: you won't ever be satisfied.

You are right, though, they did come out later, and both were first attempts at DX11 too.
Don't get me wrong, the GTX480 was a beast that was let down by the combination of high power consumption (heat) and a stock cooler that was not up to the job (way too hot, way too noisy). People who managed to get a third-party cooler fitted to tame its noise and heat had an amazing card (if we could overlook the initial overpricing).

I'm pretty sure the £250 for the GTX480 was EOL pricing, as the GTX580 and GTX570 launched in November/December 2010. I recall OcUK also had some limited runs of the GTX470 and GTX480 being sold at EOL prices of around £170 and £199.99 at the time of the GTX580 and GTX570 release.
 
Here we are: a YouTuber who has committed the sin of calling the 4090 'incredibly futureproof' for 1440p (which can be generalised proportionately to UWQHD).


Mostly I recall 2017's 1080 Ti; it wasn't until cards like the Radeon VII, RX 5700 XT, and RTX 2080 arrived around two years later that anything could match it. That $700 RRP retrospectively became a bargain, and that was at a time when neither it nor its later competitors had a screwed-up, inflated economy as a backdrop.
 
Here we are: a YouTuber who has committed the sin of calling the 4090 'incredibly futureproof' for 1440p (which can be generalised proportionately to UWQHD).


Mostly I recall 2017's 1080 Ti; it wasn't until cards like the Radeon VII, RX 5700 XT, and RTX 2080 arrived around two years later that anything could match it. That $700 RRP retrospectively became a bargain, and that was at a time when neither it nor its later competitors had a screwed-up, inflated economy as a backdrop.
You made that 1080 Ti look better than it actually is.
The 1080 Ti came out before the new consoles released.
The 1080 Ti's so-called superiority came from the fact that you were all mostly playing console ports built for the Xbox One and PS4.
The same thing happened with Xbox 360 and PS3 games until the launch of the Xbox One and PS4: a GTX 480 could get you through the entire generation.

The 1080 Ti can now be bested by an RTX 2060 in some games, though only by single digits, and it can still play some cross-gen games very well, just as the GTX 480 did.
piimKPY8cHzWp4JwYUYCjY-1200-80.png

The pattern is exactly the same.

 