
NVIDIA RTX 50 SERIES - Technical/General Discussion

I'm not saying you're wrong, and I'm not saying the 5080 is a good deal, because it isn't, but you need to factor in inflation: £650 in Sep 2020, which is when the 3080 came out, would be ~£815 now (source). If the 5080 were priced at £750 as you suggested, it would actually be like a 3080 costing ~£560 at release.
The 3080 is not the same class of GPU as the 5080 though; if you're going to bring inflation into it, you need to calculate from the price of the 3060 Ti, which was £369.
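To put rough numbers on that inflation adjustment, here's a minimal sketch; the cumulative factor is an assumption implied by the £650 → ~£815 figures above, not an official CPI series:

```python
# Minimal sketch of the inflation adjustment described above.
# The cumulative factor is inferred from the quoted figures
# (650 * 1.25 ~= 815); a proper version would use an official
# CPI/RPI series for Sep 2020 to today.
CUMULATIVE_INFLATION = 815 / 650  # ~1.25, implied by the post

def to_todays_money(price_2020: float) -> float:
    """Express a Sep 2020 price in today's money."""
    return price_2020 * CUMULATIVE_INFLATION

def to_2020_money(price_today: float) -> float:
    """Express a price today in Sep 2020 money."""
    return price_today / CUMULATIVE_INFLATION

print(f"3080's £650 launch price today: £{to_todays_money(650):.0f}")  # ~£815
print(f"£750 today in 2020 money:       £{to_2020_money(750):.0f}")    # ~£598
# (the post's ~£560 implies a slightly higher inflation figure)
```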
 
The thought of having to pay at least 45% more for a 65% improvement doesn't seem all that appealing. To me, that isn't progress, especially not for a four-year wait.
The problem with using averages is that they don't tell the whole story. With my 3080 I was having to run pretty conservative settings AND use DLSS Balanced or Performance in a whole bunch of games that I actually play, games which don't tend to be included in benchmarks, or not at the settings I'd actually use to get playable frame rates.

With the 5080 I can now play everything in my library maxed out. You can cherry-pick a game that runs poorly, but then it tends to run poorly on everything, not just the 5080.

I've gone from playing on middling settings with DLSS cranked up to max settings with DLSS at native, or maybe Quality, for the most part.

On my setup, in the games I play at 1440UW, it's near enough double the performance.
The 3080 is not the same class of GPU as the 5080 though; if you're going to bring inflation into it, you need to calculate from the price of the 3060 Ti, which was £369.
Okay, but only if we take into consideration that Samsung 8nm was around $5,000 per wafer and TSMC 4nm is around $16,000 per wafer (with 3nm reportedly going to be $20,000).
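Translating wafer prices into per-die cost makes the point clearer. A rough sketch using the standard dies-per-wafer approximation; the die sizes (~628 mm² for the 3080's GA102, ~378 mm² for the 5080's GB203) are approximate public figures brought in for illustration, not from this thread, and yield is ignored:

```python
import math

WAFER_DIAMETER_MM = 300  # standard 300 mm wafers on both nodes

def dies_per_wafer(die_area_mm2: float, diameter_mm: float = WAFER_DIAMETER_MM) -> int:
    """Common approximation: gross wafer area over die area, minus an edge-loss term."""
    radius = diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_die(wafer_cost: float, die_area_mm2: float) -> float:
    return wafer_cost / dies_per_wafer(die_area_mm2)

# Die sizes are approximate public figures, used here for illustration only.
print(f"GA102 (~628 mm2, $5,000 Samsung 8nm wafer): ~${cost_per_die(5_000, 628):.0f} per die")
print(f"GB203 (~378 mm2, $16,000 TSMC 4nm wafer):  ~${cost_per_die(16_000, 378):.0f} per die")
# Despite being a much smaller die, the 5080's silicon comes out at
# roughly 1.8x the cost, before yield is even considered.
```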
 
I'm not saying you're wrong, and I'm not saying the 5080 is a good deal, because it isn't, but you need to factor in inflation: £650 in Sep 2020, which is when the 3080 came out, would be ~£815 now (source). If the 5080 were priced at £750 as you suggested, it would actually be like a 3080 costing ~£560 at release.

Fair enough about the inflation point on absolute costs, but over the last few years the rate of inflation has been higher than the historical average for the previous decade, so the figures have been skewed somewhat.

Equally, I don't really care what the card is called. I don't go by that metric; I simply go by a cost/performance metric, and after four years there still isn't any card that offers at least a 50% uplift in (real) performance for £650, and that's pretty poor going (rough numbers in the sketch below).

The problem with using averages is that they don't tell the whole story. With my 3080 I was having to run pretty conservative settings AND use DLSS Balanced or Performance in a whole bunch of games that I actually play, games which don't tend to be included in benchmarks, or not at the settings I'd actually use to get playable frame rates.

With the 5080 I can now play everything in my library maxed out. You can cherry-pick a game that runs poorly, but then it tends to run poorly on everything, not just the 5080.

I've gone from playing on middling settings with DLSS cranked up to max settings with DLSS at native, or maybe Quality, for the most part.

On my setup, in the games I play at 1440UW, it's near enough double the performance.

That's fine, I wasn't aiming anything at you. You have a different set of criteria when parting with your money than I do; if it's made you happy then great, enjoy.

For me, I'm a stubborn **** who hates parting with money on any level, especially if I don't get what I want for the price I expect :D
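As a rough check on the cost/performance framing used above, here's a minimal sketch; the 1.45x price and 1.65x performance multipliers are the figures quoted earlier in the thread, not independent measurements:

```python
# Perf-per-pound check using the thread's own numbers:
# paying ~45% more money for ~65% more performance.
price_ratio = 1.45  # 5080 price relative to the 3080's £650
perf_ratio = 1.65   # 5080 performance relative to the 3080

value_gain = perf_ratio / price_ratio
print(f"Performance per pound vs the 3080: {value_gain:.2f}x")  # ~1.14x

# After a four-year wait, each pound buys only ~14% more performance,
# which is the "pretty poor going" point made above.
```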
 
HUB said on their podcast they'd heard the current 5080 die is actually cheaper to manufacture this time vs the 4080. You would hope so, given the process is the same, and I can't imagine 4nm capacity being as in demand this time as more and more customers switch to 3nm.

Also, there is no way the memory costs $300+, as someone else said.
I agree, it should be cheaper. VRAM is cheaper, and the die should be too because the process technology hasn't changed, so it's just Nvidia being greedy again.
 
Guys,

Any advice on which 5070 Ti to go for?

Or another that's not on that list?

Many thanks
I would wait to see how it and the 9070 XT stack up against each other before committing; they'll likely offer similar performance, but hopefully there'll be a sizeable price difference.

Other than that, if you're determined to get a 5070 Ti as early as possible, I would go for the cheapest one in stock at the time of purchase. Judging by the 5080s, prices are all going to be inflated and vary wildly from SKU to SKU, and the practical difference between cards will be negligible. There's no point paying hundreds of extra pounds for at best a couple of percent more performance.

Either that, or just wait until stock settles down. I honestly don't think it will take very long, maybe two or three months, and we'll hopefully see better stock levels and better prices, and the earliest adopters will have figured out whether any cards genuinely stand out as better or worse than the pack.
 
The 3080 is not the same class of GPU as the 5080 though; if you're going to bring inflation into it, you need to calculate from the price of the 3060 Ti, which was £369.

The 3080 was an outlier, as Nvidia were stuck on Samsung and AMD were rumoured to be competitive with the 6000 series.

The 680, 980, 1080, 2080, 4080 and 5080 all used a 256-bit memory bus. Due to the modular design, this largely sets the CUDA core count of the generation.
For the 3080 they were forced to use a 320-bit bus to get the performance they needed, which resulted in only a small performance gap to the 3090 halo card, hence why the 3080 was such an amazing deal.
It's 12 years since Nvidia launched an 80-class GPU with a >256-bit bus, and that was the 780 in 2013.

Nvidia hit a wall this year: on the same process node, they're already at 360W for a 256-bit 5080.
Scaling isn't great: +100% memory bandwidth, +100% die area and +60% power give only +50% performance on the 5090 (TechPowerUp figures; a quick perf-per-watt check follows below).
Any decent uplift in the 5080 would have made it a 4090 in all but name, at 4090 pricing, and anyone with that budget would buy the 5090 anyway, making it DOA.
Why would Nvidia take a hit on their margin when they can sell the same die as a data centre SKU?

This generation is a disappointment; it's basically a refresh across the stack, with the only real improvement being an even larger halo card, which they could have built two years ago.
These are data centre cards, where the data centre features are given a gaming spin and sold to us as gaming cards.
AI TOPS have doubled shader for shader, but there's nothing much here for gamers other than the stagnation you get when there's no competition.
14nm+++++?

The 5080 aligns with what the 80 series has been for the majority of the period since the GTX 680 launched in 2012.
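Plugging the scaling figures quoted above into a quick efficiency check (a sketch using the post's own 5090-vs-5080 numbers, not independent measurements):

```python
# 5090 relative to the 5080, using the figures quoted above.
bandwidth = 2.00  # +100% memory bandwidth
die_area = 2.00   # +100% die area
power = 1.60      # +60% power (360 W -> ~575 W)
perf = 1.50       # +50% performance (TechPowerUp averages)

print(f"Perf per watt:       {perf / power:.2f}x")     # ~0.94x, slightly worse
print(f"Perf per mm2 of die: {perf / die_area:.2f}x")  # ~0.75x
# Well-below-linear scaling on every axis is the "hit a wall" point above.
```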
 
I've said it before and I'll say it again: just because the rate of inflation is high, it doesn't mean people's wages have increased to match it; if anything, people are now looking for value for their money more than ever. £1,000 for someone four years ago can still hold the same value to that person today. I don't know anyone who thinks, 'Okay, £1,000 four years ago was worth this much to me, so now I need to compensate for inflation, so £1,400 holds the same value as £1,000 did.'
 
More and more people are finding their FE 5090/5080 can't run PCIe 5.0.

So much for Nvidia's proprietary design.
I did wonder about that. I've seen some very strange behaviour with riser cables that was only solved by dropping them down to a lower PCIe generation than they're rated for, and I did wonder whether the connector on the Founders card would play silly buggers with signal integrity.
 
Inflation goes up so costs go up on all sides… Sounds like it does play a big part, especially in recent years, no?
Yeah, it's probably a bit strongly worded saying 'completely invalid', but the underlying point is still the same.

(I've deleted the first bit as, yes, it's incorrect.)
 
More and more people are finding their FE 5090/5080 can't run PCIe 5.0.

So much for Nvidia's proprietary design.

der8auer was saying there wasn't enough testing done, and now the **** is hitting the fan.

They all seem to be on AM5 in that thread.

Well, Ryzen dominates DIY, so that's not surprising. It won't have any relevance to the dodgy FE connection, IMO.
 