
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
The 3090 looks a lot like this generation's Titan.

The tier game is just that, a game. The 2080 Ti was not the fastest Turing card. If people thought the 2080 Ti was Turing's "flagship" when Nvidia sold another Turing card that was faster, then that same "logic" can be applied to the 3080 being the "flagship" even though there is another card that's turned the VRAM up to 11 and is a tiny bit faster in the stack... looks a lot like Turing's name game. (Just scribble out the word "Titan" and write the number "3090" in crayon.)
I don't want to start the argument up again so I'll just say this: you're missing the point. Yes, Titans exist as far back as Kepler and they were never billed as the flagship gaming card. That was always the Ti. The point here is Nvidia pulling a mind game with Ampere's marketing, which a lot of people are falling for, fawning over themselves and Uncle Leather Jacket. The narrative is that suddenly, after 5 generations, a non-Ti card is the flagship model, with all of the intentional and distorted comparisons that brings. Of course Ampere is this "massive" leap in price and performance over Turing because the flagship is almost half the price and 50% faster. Who could ask for more, it's amazing!

But it's just smoke and mirrors to yet again increase prices. Pushing the 3080 as the flagship has sent people scurrying to buy a £700 card that just doesn't exist in any practical terms, forcing them into paying at least £150 more for a version that they can actually get. Now granted, that's still a solid move up from the 2080 Ti, but it's still a distortion and deflection intended to dupe the ill-informed, fanning the flames of idiot fanboyism and brand loyalism.

The simple fact that you've said to me "but the tier game is just that, tiers mean nothing" perfectly illustrates this. It doesn't matter if it's a flagship or not, the 800-class card is more expensive yet again. But as long as people say "it's the flagship" it apparently doesn't matter and idiots will blindly open their wallets. The tier "game" is not a game, it's very real product segmentation.
 
The only available Radeon RX 5700 XT:

My basket at Overclockers UK:
Total: £489.89 (includes shipping: £9.90)


RTX 2070S:

My basket at Overclockers UK:
Total: £409.85 (includes shipping: £9.90)

Performance at UHD 2160p [Averages across 23 games]:


https://www.techpowerup.com/review/gigabyte-geforce-rtx-3090-eagle-oc/32.html

There are cheaper RX 5700 XT cards, but they're out of stock with no information about availability.

Not sure where the 11.4% comes from, surely the 2070 Super is 5% better than the 5700XT?
 
Need. More. Leaks.
 
Mathematics, dear, Maths, and it's from the lower grades, though :D

The formula to calculate the percentage gain is
[(49 − 44) × 100] / 44 ≈ 11.4%

But we're not adding a percentage on, we're expressing that the 2070 Super is 5% better than the 5700 XT based on the same baseline that the 49% and 44% were calculated on.
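Spelled out as code (a minimal sketch; the 49 and 44 are the TPU relative-performance scores, i.e. percent of an RTX 3090):

```python
# Percentage gain of the faster card relative to the slower card,
# using the formula quoted above: [(49 - 44) * 100] / 44.
def percentage_gain(faster: float, slower: float) -> float:
    """How much faster (in %) `faster` is relative to `slower`."""
    return (faster - slower) * 100 / slower

print(round(percentage_gain(49, 44), 1))  # → 11.4
```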
 
But we're not adding a percentage on, we're expressing that the 2070 Super is 5% better than the 5700 XT based on the same baseline that the 49% and 44% were calculated on.

But the baseline is the 3090... so the 2070S is showing as 5% of a 3090 better than the 5700 XT... That's ENTIRELY different to your statement :p
 
Subtracting 50% from 100 gives you 50.
Adding 50% to 50 gives you 75.
To get from 50 to 100 you have to add 100%.

If GPU X is 50 FPS and GPU Y is 75 FPS the difference is not 25%, it's 50%: the 50 FPS GPU needs to be half as fast again (50% faster) to match the 75 FPS GPU.

The easiest way to do this is to divide the faster GPU by the slower GPU, so you divide 75 FPS by 50 FPS (75 / 50 = 1.5); the .X part, in this case .5, is your percentage. You can check it by adding your 50% result to the 50 FPS GPU: 50 + 50% = 75 FPS.

It's really simple :)
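The divide-the-faster-by-the-slower method above can be sketched like this (using the same 50 FPS and 75 FPS example numbers):

```python
# Sketch of the method from the post: divide the faster GPU's FPS
# by the slower GPU's FPS; the fractional part is the percentage gain.
fast, slow = 75.0, 50.0
ratio = fast / slow              # 1.5
gain_pct = (ratio - 1) * 100     # 50.0: the slower GPU needs +50% to match

# Check: adding the 50% result back onto the slower GPU recovers 75 FPS.
assert slow * (1 + gain_pct / 100) == fast

print(gain_pct)  # → 50.0
```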
 
Anyone else seeing the irony in Humbug trying to teach percentages? Though in his example he is right. But please folks, let's keep this for info on "Big Navi". Yes, I also get the irony of posting off topic to ask we keep it on topic. :)
 
He keeps posting TPU's 4K results over and over again, knowing Navi 10 has particularly bad 4K results because it's not a 4K card; he knows this.
TPU also don't test older cards on new drivers.

And then we get lectures on maths. Maths is the distraction from the cherry-picked bench. The cycle continues.

Mathematics, dear, Maths, and it's from the lower grades, though :D

:D
 
Anyone else seeing the irony in Humbug trying to teach percentages? Though in his example he is right. But please folks, let's keep this for info on "Big Navi". Yes, I also get the irony of posting off topic to ask we keep it on topic. :)
He'd be right if he were talking about the percentage increase, not the difference.

The difference between 50 and 75 is 25; 25 is 50% of 50, but 33.33…% of 75... so...
 
Anyone else seeing the irony in Humbug trying to teach percentages? Though in his example he is right. But please folks, let's keep this for info on "Big Navi". Yes, I also get the irony of posting off topic to ask we keep it on topic. :)

It is info on Big Navi, and it's also not off-topic :P We are speculating how Big Navi might compare to the only thing we can compare it to, based on rumours.

If you don't like speculation, fine, ignore it and the constant supply of rumours; that's all we are getting until late this month. Maybe stay out of the thread until then.

And then we get lectures on maths. Maths is the distraction from the cherry-picked bench. The cycle continues.



:D

Oh for sure, I know Steve retested every GPU on his slides with the latest drivers; that makes them more relevant than TPU's slides, which recycle old data. I am indeed cherry-picking the most accurate and up-to-date information.

He'd be right if he were talking about the percentage increase, not the difference.

The difference between 50 and 75 is 25; 25 is 50% of 50, but 33.33…% of 75... so...


Subtracting 50% from 100 gives you 50.
Adding 50% to 50 gives you 75.
To get from 50 to 100 you have to add 100%.

If GPU X is 50 FPS and GPU Y is 75 FPS the difference is not 25%, it's 50%: the 50 FPS GPU needs to be half as fast again (50% faster) to match the 75 FPS GPU.

The easiest way to do this is to divide the faster GPU by the slower GPU, so you divide 75 FPS by 50 FPS (75 / 50 = 1.5); the .X part, in this case .5, is your percentage. You can check it by adding your 50% result to the 50 FPS GPU: 50 + 50% = 75 FPS.

It's really simple :)

If after reading that you still think a 75 FPS GPU is 25% faster than a 50 FPS GPU, I'm at a loss to help you.
 
@james.miller what you're talking about is a "percentage points difference", not a "percentage gain", which is what you need when calculating how much performance GPU X needs to gain to compete with GPU Y.
 
Not sure where the 11.4% comes from, surely the 2070 Super is 5% better than the 5700XT?
The easiest way to look at it is like this: start from whichever card has the 100% score.
The 5700 XT had 44% of the performance of the 3090 Eagle OC and the 2070 Super had 49% of it.
So if we assume the 3090 had 100 FPS, it's easy to deduce that the 5700 XT would have had 44 FPS and the 2070S 49 FPS.
Then just compare the 5700 XT to the 2070S with (44 / 49) × 100, which is 89.8, i.e. the 5700 XT had 89.8% of the performance of the 2070 Super (about 10% slower). Flip the baseline, (49 − 44) / 44 × 100, and the 2070S comes out about 11.4% faster.
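The same re-baselining as a quick sketch (using the 44% and 49% figures from the TPU chart, both expressed as percent of an RTX 3090):

```python
# Re-baseline the TPU relative-performance scores (% of a 3090)
# to compare the two cards directly against each other.
xt_5700, s_2070 = 44.0, 49.0   # % of RTX 3090 at 2160p, figures from the post

relative = xt_5700 / s_2070 * 100            # 5700 XT as % of the 2070S
gain = (s_2070 - xt_5700) / xt_5700 * 100    # 2070S's gain over the 5700 XT

print(round(relative, 1))  # → 89.8
print(round(gain, 1))      # → 11.4
```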
 