AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

I mean... at the very least it looks like AMD are going to bring out a faster card than the 2080Ti, probably with an MSRP closer to half of the 2080Ti's, so 2080Ti owners are gonna get shafted either way. It has been that way more often than not with GPU generation changes, so I don't have much sympathy.

2080Ti owners were screwed the day they bought their cards. They got the same generational improvement as usual but paid almost double to get it.
 
2080Ti owners were screwed the day they bought their cards. They got the same generational improvement as usual but paid almost double to get it.

As is always the case with cards like this - either commit early or don't buy if you care about VfM, etc. (though these aren't really cards for people who care about VfM anyhow). People who bought the Titan cards, etc. on release often got some good mileage out of them - those that dithered and bought later, not so much.
 
No kidding, 2080ti owners in particular got shafted hard, less than two years after giving them over a grand. This launch has been beyond disastrous, even when you consider the pandemic.
To be shafted, or not to be shafted, that is the question?
Is it that they got shafted harder for buying a 2080Ti for $1200 that's now beaten by a non-Ti 3080 for far less?
Or
Is it that they got shafted harder for selling the 2080Ti for between 1/4 and 1/3 of the price they paid, and are now in a queue for a 3080 with no ETA?
Or
Is it that they got shafted harder buying a 2080Ti for $1200, selling it at 1/4 the price for a non-Ti 3080, and are now waiting in a queue for that 3080 which is rumoured to be EOL?

I can't wait for the movie!!!! :D

Gamers are hoping Lisa Su is going to ride in on a massive white stallion, hold up a bignavi2 card like a shining beacon and save us all from this high-price-no-stock Nvidia nightmare.
That just might happen. It just might happen. But I wouldn't hold my breath.

Personally, I don't see it happening. For the next six to twelve months, the consoles are where the party is at if you are a gamer. If you are only interested in high frames per second, then prepare to pay a price and/or wear out your F5 key.
You can get a console interest free for 24 months.
https://techcrunch.com/2020/09/09/m...hly-financing-plan-for-its-new-xbox-consoles/

Looks like they have cookies...Party over here!!!! :D
 
As is always the case with cards like this - either commit early or don't buy if you care about VfM, etc.

That "logic" doesn't hold up when you step back and look at...well...pretty much EVERY generation other than Turing.

Turing (more specifically the 2080Ti) was not, and is not, "always the case".
 
So, thinking about it further: 190W for the entire XSX.
The CPU is basically a 4800U running at a static frequency. According to Notebookcheck, a 4800U draws about 50W in Cinebench, but that's with it boosting higher, and gaming loads the CPU less than Cinebench does. Looking at a few places, core power draw is just over 40W when gaming, so for the XSX the CPU portion is maybe 30-35W.
Take off everything else in the console that it needs to run but a standalone GPU wouldn't need.
That gives about 130W for the GPU itself, plus memory on top of that.
A 130W GPU with roughly the power of a 2080 Super.

It gets crazier when you compare things: the XSX has 30% more CUs than a 5700XT at about 5% lower frequency, so theoretically ~25% more performance. Using TechPowerUp's performance summary, you need about 25% more performance on top of a 5700XT to get to the 2080 Super, so that's right in line.
But the XSX GPU only uses ~130W, not the 220W the 5700XT does. To make a 5700XT 25% faster you'd need roughly 25% more power in a bigger RDNA1 card (more CUs etc.) rather than just overclocking it: 220W plus 25% = 275W. So an RDNA1 card with the power of a 2080 Super would use about 275W, while the XSX does the same with ~130W. That's about a 100% performance-per-watt improvement. The ****.
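A quick back-of-the-envelope version of that in Python, purely to make the arithmetic explicit; all the wattages and the CU/clock scaling are the rough estimates from the post above (plus my own guess for the non-GPU bits), not measured figures.

[CODE]
# Back-of-envelope maths for the XSX GPU power estimate above.
# Every figure here is a rough estimate from the post, not a measurement.

xsx_total_w = 190   # claimed whole-console draw under load
cpu_w = 32          # the post's ~30-35W guess for the down-clocked 8-core Zen 2
other_w = 28        # SSD, fans, board, etc. -- pure guess, to land at ~130W for the GPU

gpu_w = xsx_total_w - cpu_w - other_w
print(f"Estimated XSX GPU power: ~{gpu_w} W")                 # ~130 W

# Scale a 5700XT (40 CUs) up to the XSX's 52 CUs at ~5% lower clocks.
perf_vs_5700xt = (52 / 40) * 0.95
print(f"Theoretical perf vs 5700XT: {perf_vs_5700xt:.3f}x")   # ~1.235x, the '~25% more' above

# Hypothetical bigger RDNA1 card with the same performance,
# assuming power scales roughly linearly with performance.
rdna1_w = 220 * perf_vs_5700xt
print(f"RDNA1 equivalent: ~{rdna1_w:.0f} W")                  # ~272 W, close to the 275W above
print(f"Perf/W improvement: ~{rdna1_w / gpu_w - 1:.0%}")      # ~109%, i.e. roughly double
[/CODE]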
 
As is always the case with cards like this - either commit early or don't buy if you care about VfM, etc. (though these aren't really cards for people who care about VfM anyhow). People who bought the Titan cards, etc. on release often got some good mileage out of them - those that dithered and bought later, not so much.

Same with 3090s today. There are enough hints and rumours around that there will be a faster 3080Ti out next year for probably £500 less than they have paid for their 3090. That's worse depreciation than if you bought a 2080Ti at launch.
 
That "logic" doesn't hold up when you step back and look at...well...pretty much EVERY generation other than Turing.

Turing was not, and is not, "always the case".

I'm not even sure what you are saying there - maybe missed the point of my post.

Same with 3090s today. There are enough hints and rumours around that there will be a faster 3080Ti out next year for probably £500 less than they have paid for their 3090. That's worse depreciation than if you bought a 2080Ti at launch.

There is no reason to buy a 3090 unless you either need the feature set (some content creators) or absolutely must have the best at any cost. Makes me laugh when people complain about the cost of them.
 
I'm not even sure what you are saying there - maybe missed the point of my post.

Maybe you missed the point of *my* post?

The 780Ti, 980Ti, and 1080Ti all offered a decent uplift over the previous gen without the ridiculous price increase the 2080Ti brought.

The 2080Ti was not just the same old "high-end stuff is expensive" offering. It was a rip-off.

danlightbulb's chart:

[chart image]
 
Not sure most 3080s, offering just 20% more than a 2080Ti, are THAT fantastic.

Maybe at £650 they’re pretty good, but some people are paying >£800 for one at the moment, then waiting god knows how long for it to turn up.

The 3090 is a bigger rip off than the 2080Ti IMO. Fancied a Strix OC 3090 but not at £1700. :eek:

Now I’m very much looking forward to whatever AMD bring to the table instead.
 
Not sure most 3080s, offering just 20% more than a 2080Ti, are THAT fantastic.

Maybe at £650 they’re pretty good, but some people are paying >£800 for one at the moment, then waiting god knows how long for it to turn up.

The 3090 is a bigger rip off than the 2080Ti IMO. Fancied a Strix OC 3090 but not at £1700. :eek:

Now I’m very much looking forward to whatever AMD bring to the table instead.

A 20%+ increase in performance on an already superb card like the 2080Ti is a big increase, imho... it was always the stupid price of the 2080Ti that got questioned (in my mind)... at £1100+ for a card that offered at most 40% more performance than my 5700XT, which I paid around £330 for, the maths just didn't work out. No matter which way we peel it, the 2080Ti was an expensive luxury... but the best card in the world.

However, if we're looking at pure performance, the 2080Ti is still fantastic and so is the 3080... the 3090 doesn't impress me at all really, tbh: yes it's faster, but barely... and once manufacturing gets better and we actually see the 3080 out in the wild, when the 3080 becomes overclockable you'd get 3090 performance from it anyway. I really don't know why nVidia even released the 3090 from day one... bravado? Not sure, but I just don't see where they can go performance-wise for 12 months... they're kinda at their limit (I think).
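Just to spell out the value-for-money maths in the post above, a quick sketch; the £1100 / £330 prices and the ~40% performance gap are the poster's figures, not benchmark data.

[CODE]
# Perf-per-pound comparison using the figures quoted above (poster's numbers).
price_2080ti = 1100.0   # GBP, the launch-era price quoted in the post
price_5700xt = 330.0    # GBP, what the poster paid
perf_2080ti = 1.40      # relative performance, 5700XT = 1.00

print(f"Price ratio: {price_2080ti / price_5700xt:.1f}x")                 # ~3.3x the cost
print(f"5700XT perf per £1000: {1.00 / price_5700xt * 1000:.2f}")         # ~3.03
print(f"2080Ti perf per £1000: {perf_2080ti / price_2080ti * 1000:.2f}")  # ~1.27
# -> roughly 2.4x less performance per pound for the ~40% extra performance
[/CODE]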
 
Not sure most 3080s, offering just 20% more than a 2080Ti, are THAT fantastic.

Maybe at £650 they’re pretty good, but some people are paying >£800 for one at the moment, then waiting god knows how long for it to turn up.

The 3090 is a bigger rip off than the 2080Ti IMO. Fancied a Strix OC 3090 but not at £1700. :eek:

Now I’m very much looking forward to whatever AMD bring to the table instead.

I think the 3090 is priced for people that bought into the 2080Ti's generational price/performance offering.

The 3090 offers more of a performance improvement over last gen than the 2080Ti offered, and does so for a smaller price increase over last gen than the price increase the 2080Ti offered.

Had Nvidia offered only the 3090, people who had been lecturing us on how more performance was just going to cost more money (because "Nvidia is not a charity", "you gotta pay to play", "if you want the best", etc.) would have just said "we told you so."

But Nvidia offered something to those of us who saw the 2080Ti as a rip-off, because a ~30% generational improvement has historically been offered at around last gen's price points, not at almost double them.

Without the 3080, the 3090 would just be this gen's 2080Ti.
 
Maybe you missed the point of *my* post?

The 780Ti, 980Ti, and 1080Ti all offered a decent uplift over the previous gen without the ridiculous price increase the 2080Ti brought.

The 2080Ti was not just the same old "high-end stuff is expensive" offering. It was a rip-off.

danlightbulb's chart:

[chart image]

Looks good and all, but the issue with comparing it to the 2080S is that the 2080S uses the TU104 die, not the top Turing die (TU102) used for the 2080Ti, whereas the 3080 uses GA102, the top Ampere die. The 2080S should really be compared to the 3070 when it comes out, seeing as that uses GA104, and the 3080 should only be compared to the 2080Ti.
 
No kidding, 2080ti owners in particular got shafted hard, less than two years after giving them over a grand. This launch has been beyond disastrous, even when you consider the pandemic.

I didn't give them a grand for mine; I paid well under that. Prices go up and down though: just before the 3080 launch the same cards were selling for a crazy £1200, and they're worth barely half that now. I sold my 1080Ti not long ago for half what I paid for it two years earlier, which is not a bad return.
 
The 3090 offers more of a performance improvement over last gen than the 2080Ti offered, and does so for a smaller price increase over last gen than the price increase the 2080Ti offered.

No the benchmark has changed...
It's the 3080 now..
And folks do sometimes get bored rocking the same hardware for 2 years..
 
Looks good and all, but the issue with comparing it to the 2080S is that the 2080S uses the TU104 die, not the top Turing die (TU102) used for the 2080Ti, whereas the 3080 uses GA102, the top Ampere die. The 2080S should really be compared to the 3070 when it comes out, seeing as that uses GA104, and the 3080 should only be compared to the 2080Ti.

All that matters is: how fast is it, and how much does it cost? The 2080Ti offered a run-of-the-mill generational performance increase for an epic rip-off of a price increase.

And I was comparing it to the 1080Ti, not the 2080S.
 
No the benchmark has changed...
It's the 3080 now..
And folks do sometimes get bored rocking the same hardware for 2 years..

I thought I made it clear that I was using generational improvement as a metric. I also pointed out that the difference between the 2080Ti and the 3090 is that the 2080Ti didn't have another card in the same stack highlighting what a rip-off it was.
 
I thought I made it clear that I was using generational improvement as a metric. I also pointed out that the difference between the 2080Ti and the 3090 is that the 2080Ti didn't have another card in the same stack highlighting what a rip-off it was.

Decisions are made with the information available in the present... and you also have to consider the specific setup the upgrade decision is being made from... your arguments assume time is a free resource, which is another fallacy...

The 3080 is the new benchmark... you've got to make all your decisions factoring in this new reality and your current setup, not some complicated method of generational change and stuff like that.

Rest is up to you.
 
Decisions are made with the information available in the present... and you also have to consider the specific setup the upgrade decision is being made from... your arguments assume time is a free resource, which is another fallacy...

The 3080 is the new benchmark... you've got to make all your decisions factoring in this new reality and your current setup, not some complicated method of generational change and stuff like that.

Rest is up to you.

When the 2080Ti was released, I owned a 1080Ti that was just as fast as the 2080. Turing offered me performance I already had for money I had already spent. Oh, and the opportunity to beta test their ray tracing.

The 2080Ti offered the performance I expected at a price that was historically terrible.

Even if you have the memory of a goldfish and pretend that you were born the day Ampere launched and none of the previous generations ever happened, the 3090 is still just offering more performance for more money. Without looking at previous generations to form a point of reference, you are left with nothing but "more performance costs more money", and how that is supposed to look can vary wildly from one person's opinion to the next.

Looking back at previous generations seems like a reasonable way to mitigate the subjective nature of what performance is "worth".

Who's to say that 15% more performance shouldn't cost more than twice as much when it's the absolute best available? Well, previous generations, that's "who".

Heck, using the goldfish approach, all Nvidia needs to do to make the 3090 a good value is raise the price of the 3080 to $1399.

Boom. Done. The 3090 is suddenly "worth the money". lol
 