What matters more is where it sits in the performance stack and the performance per £.
I am not sure I follow this post properly.
I don't get how you reach the conclusion that the 4090 is 'flaccid' when it is still the 2nd best card on the market and has been the best for over 2 years.
So let's say it stays the second best GPU on the market for the next 2 years (let's forget they are probably bringing out a 5080 Ti); that's 4 and a bit years in total of being in the top 2.
And this is even before considering the generational uplift from the 3000 series, which was over 60%.
Now, in 2 years(ish) time we are expecting the 6000 series. This is probably going to be on a new node and, as per previous experience, expected to give a good performance uplift. I don't see a repeat where the 5090 will still be the 2nd best card on the market at that point. So is the value the same? This is why the term 'aging like fine wine' is being used for the 4090.
I wouldn't class a 30%ish average as miles ahead, but I guess that's subject to opinion.
All of the 'mega benefits' that the 4090 enjoyed have been lost. It's no longer the fastest (the 5090 is miles ahead --> see link), it's only a smidge faster than the 5080 (when overclocked), and even at launch it didn't offer anything better when it comes to 'price per frame' compared to the 50 series.
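If anyone wants to sanity-check the 'price per frame' / performance-per-£ angle themselves, here's a minimal sketch; every price and average-FPS figure in it is a placeholder, not real benchmark data:

```python
# Rough price-per-frame / performance-per-pound comparison.
# All figures below are made-up placeholders - swap in real prices and
# benchmark averages before reading anything into the output.
cards = {
    "RTX 4090": {"price_gbp": 1600, "avg_fps": 100},
    "RTX 5080": {"price_gbp": 1000, "avg_fps": 95},
    "RTX 5090": {"price_gbp": 2000, "avg_fps": 130},
}

baseline_fps = cards["RTX 4090"]["avg_fps"]

for name, c in cards.items():
    pounds_per_frame = c["price_gbp"] / c["avg_fps"]   # lower is better
    frames_per_pound = c["avg_fps"] / c["price_gbp"]   # higher is better
    vs_4090 = 100 * c["avg_fps"] / baseline_fps        # e.g. 130 means "30% ahead"
    print(f"{name}: £{pounds_per_frame:.2f} per frame, "
          f"{frames_per_pound:.3f} fps per £, {vs_4090:.0f}% of a 4090")
```

Swap in launch prices versus current street prices and the 'value retention' argument falls out of the same arithmetic.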
I am just jesting / joshing though with the 'flaccid' comments. The 4090 is a good card and is probably one to hold onto for now...
... but if you are holding onto a 40 series card for another 2 years, then the whole 'retains its value' point is going in the bin, because it'll have a low resale value by the time the 60 series comes out, by virtue of being 2 gens old (similar-ish to the 30 series resale value now).
Nobody knows how well the 50 series will hold its value compared to the 60 series. Even if the 6080 beats it, which I hope it will, it will still offer the same performance then as it does today... just as the 4090 offers the same performance today.
The 4090 can be overclocked too - we need an apples-to-apples comparison.
How dare you suggest having a fair comparison, just get out until you learn to be more biased.
The 4090 is the queen and the 5090 is the new king?
And the 5080 is the joker.
Aren't you forgetting VRAM?
But really, who is surprised or even concerned? Technology marches onwards and upwards.
I see the new drivers offer DLSS4 stuff - saw someone say it was really good on his 4090 with Cyberpunk.
Oooh, I need to test, brb.
The 4090 will reign for a long time, I think - and it's lovely under water.
Not sure how we are going to cool a 5090 FE under water.
The RTX 6080 apparently might not beat the 4090, going by the current trend.
You can see where he's coming from, but the 50 series is on the same node as the 40 series, so it was never going to be a huge increase.
Not really up to date with cutting-edge graphics tech any more, but wouldn't the 6 series offering [reasonable] (I realise this is subjective) performance gains for significantly less power draw be a good/useful step?
I thought the £1800 price of some of them was totally nuts.
And then I saw this one:
Gigabyte GeForce RTX 5080 Aorus Xtreme Waterforce WB 16GB GDDR7 PCI-Express Graphics Card - www.overclockers.co.uk
The 60 series is on a new node, so it will naturally see a bigger increase in performance.
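To make the 'going by the current trend' extrapolation concrete, here's a tiny sketch that just compounds assumed xx80-class uplifts per generation; the percentages (and the 4090-vs-4080 gap) are placeholder assumptions, and changing them flips the conclusion:

```python
# Placeholder assumptions - the point is the compounding arithmetic,
# not the specific percentages.
uplifts = {"4080 -> 5080": 0.10, "5080 -> 6080": 0.12}  # assumed gen-on-gen gains
rtx_4090_vs_4080 = 1.25                                 # assumed 4090 lead over a 4080

perf = 1.0  # 4080-class performance as the baseline
for step, gain in uplifts.items():
    perf *= 1 + gain
    print(f"{step}: projected {perf:.2f}x a 4080")

verdict = "beats" if perf > rtx_4090_vs_4080 else "does not beat"
print(f"Under these assumptions a 6080 at {perf:.2f}x {verdict} "
      f"a 4090 at {rtx_4090_vs_4080:.2f}x.")
# Bump the 5080 -> 6080 figure to ~0.30 (a node-shrink-sized jump) and it flips.
```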
All the companies buying 50 series cards (when they're actually available) for processing applications rather than games - as opposed to enthusiasts gaming and such - will want lower power usage, won't they?
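On the power point, the number those buyers actually watch is performance per watt; a quick sketch below, where the board-power and relative-throughput figures are placeholders rather than measured values:

```python
# Placeholder board power (W) and relative throughput - illustration only.
cards = {
    "RTX 4090": {"power_w": 450, "rel_perf": 1.00},
    "RTX 5090": {"power_w": 575, "rel_perf": 1.30},
}

for name, c in cards.items():
    perf_per_watt = c["rel_perf"] / c["power_w"]
    print(f"{name}: {perf_per_watt * 1000:.2f} relative perf per kW")
```

If the 60 series gets most of its gains from a new node, that per-watt column is where it would show up first.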