AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

If you look at sales figures from Mindfactory, the 5700 XT keeps pace with the 2070 Super and sold more than the 2070 did back in July 2019. The 2080 Ti has sold roughly 1,200-1,600 units per 10,000 5700 XTs. So how can the 2080 Ti have a higher share in the Steam survey, unless the survey isn't run equally on all machines, making the result garbage and unusable?

Sure, we are told they have all of the market share...
 
Really not sure what the objections are here.
My objections are at a conceptual level: the dangerous precedent it sets, the hypocrisy of the "PC Master Race", and the rampant fanboyism and blind acceptance of marketing it fuels.

I've already derailed the thread enough, and I have neither the time nor the inclination to post in the foetid soup that is the DLSS thread.
 
The so-called Big Navi that they apparently dropped the price on, that 16GB card, is obviously 256-bit. It's at least a cut-down part; IMO it's unlikely to be Big Navi.

How is this calculated? How do people know that an "x"-bit bus must mean a certain amount of RAM?

I do find it ironic: PC gamers on many forums slagged consoles off for "cheating" with upscaling last generation, but now that PC has it, it's the "in" thing! Welcome to our new upscaling overlords!

:p
It is interesting watching the shift from rendering games at a higher resolution and then downscaling to match the monitor for better image quality, to all this upscaling that gets close enough to the same image quality.

The 2080 Ti was on sale for about a year longer than the 5700 XT.
 
Managing expectations! :P

There were posts even a decade ago on other forums pointing out how the HD5000 series market share seemed not to match the AIB sales reported by JPR, etc.
 
Even being on sale a year longer wouldn't come close to making that possible when you look at the difference in sales: somewhere between 6 to 1 and 8 to 1 in favour of the 5700 XT.
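
That 6-to-1 to 8-to-1 range follows directly from the Mindfactory numbers quoted earlier; a quick sketch, assuming roughly 1,200-1,600 2080 Tis sold per 10,000 5700 XTs:

```python
# Sales ratio implied by the Mindfactory numbers quoted earlier:
# roughly 1,200-1,600 2080 Tis sold per 10,000 5700 XTs.
for ti_units in (1200, 1600):
    print(f"{10000 / ti_units:.1f} : 1 in favour of the 5700 XT")
# Output: 8.3 : 1 and 6.2 : 1 -> "somewhere between 6 to 1 and 8 to 1"
```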
 
Sure, that's what we all want. But if DLSS can more or less get us there, what's not to like? It doesn't look like either AMD or Nvidia is getting us to 4K@120 with RT this generation without these sorts of techniques.

I'll offer up "what's not to like": the fact that it cannot be universally applied without developer input. Take a game that would really benefit from DLSS, say Red Dead Redemption 2... oh look, no support, despite there being a real need and gain for PC gamers.

It's not the technology per se, it's the application and how it can be used. The same as SLI, it needs to be handled at the hardware and driver level to work, and presumably both AMD and Nvidia will need to fix that for the chiplet designs of the future.
 
  • Fortnite
  • Death Stranding
  • F1 2020
  • Final Fantasy XV
  • Anthem
  • Battlefield V
  • Monster Hunter: World
  • Shadow of the Tomb Raider
  • Metro Exodus
  • Control
  • Deliver Us The Moon
  • Wolfenstein Youngblood
  • Bright Memory
  • Mechwarrior V: Mercenaries
https://www.rockpapershotgun.com/2020/09/01/confirmed-ray-tracing-and-dlss-games-so-far/
Is that it? Going by the way Grim5 went on about it, one would think the list would be a lot bigger than that :p

The only game on that list I want to play is Final Fantasy 15, which I want to revisit, and as far as I know DLSS is not even in that game; it was just the demo that got it? So why is it on that list? lol.
 
Hmmmm... 3080 virtually identical performance to a 2080 Ti.

Eight 32-bit memory ICs: either 1GB each for 8GB total, or 2GB each for 16GB. 8 × 32-bit = 256-bit.

A 384-bit bus would have twelve 32-bit ICs: 12 × 32-bit = 384-bit.
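
For anyone wanting to check the arithmetic, a minimal sketch (assuming one 32-bit GDDR6 IC per channel and the 1GB/2GB densities described above; the function name is just for illustration):

```python
# Minimal sketch: VRAM sizes implied by a memory bus width, assuming
# one 32-bit GDDR6 IC per channel and 1 GB or 2 GB IC densities.
def vram_options(bus_width_bits, densities_gb=(1, 2)):
    num_ics = bus_width_bits // 32        # one IC per 32-bit channel
    return [num_ics * d for d in densities_gb]

print(vram_options(256))  # [8, 16]  -> a 16GB card fits a 256-bit bus
print(vram_options(384))  # [12, 24] -> twelve ICs on a 384-bit bus
```
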
Lol. Do you really think that will end up being true? If it did, Nvidia would get so much stick that AMD might indeed reach 50% market share by the end of the generation. Nah, I do not buy it.
 
NP ^^^ :)

Bullzoid has done a follow-up vid, especially for you, to illustrate the folly of thinking it's 130-140W :p


I'm watching this now and I've already found a flaw in his reasoning re: IPC.

He said he couldn't tell the difference.

RX 480: 2304 shaders at 1120MHz, memory bandwidth 256GB/s
290X: 2816 shaders at 1050MHz, memory bandwidth 320GB/s

TPU performance, 1440p:
480: 100%
290X: 101%

The 290X has 22% more shaders and 25% more memory bandwidth.
The 480 is clocked 7% higher.

The 290X is a much more substantial GPU than the 480, and yet the performance is the same; the 480 has a substantially higher IPC.

I'll keep watching.....
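
To make the arithmetic above explicit, a quick sketch (using the specs and TPU relative-performance figures quoted in the post; "IPC" is taken loosely to mean throughput per shader per clock, which is an assumption about what is meant here):

```python
# Sanity check of the ratios quoted above. Specs and TPU relative
# performance are taken from the post; "IPC" used loosely to mean
# throughput per shader per clock.
rx480 = {"shaders": 2304, "clock_mhz": 1120, "bw_gbs": 256, "perf": 1.00}
r290x = {"shaders": 2816, "clock_mhz": 1050, "bw_gbs": 320, "perf": 1.01}

print(f"290X shader advantage:    {r290x['shaders'] / rx480['shaders'] - 1:.0%}")     # 22%
print(f"290X bandwidth advantage: {r290x['bw_gbs'] / rx480['bw_gbs'] - 1:.0%}")       # 25%
print(f"480 clock advantage:      {rx480['clock_mhz'] / r290x['clock_mhz'] - 1:.0%}") # 7%

# Per-shader, per-clock throughput, normalised by relative performance:
ipc_480 = rx480["perf"] / (rx480["shaders"] * rx480["clock_mhz"])
ipc_290x = r290x["perf"] / (r290x["shaders"] * r290x["clock_mhz"])
print(f"480 'IPC' advantage:      {ipc_480 / ipc_290x - 1:.0%}")  # ~13%
```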
 
FFXV has DLSS (the rubbish 1.0 version). For some reason, though, you can only enable it at 4K.
 
Excellent, I will try it out soon then. Shame it is the rubbish 1.0 version, but Grim5 said DLSS 1.0 was awesome, so maybe it is not so bad? :p:D
 
This comment didn't age well... wonder how many others won't...

My comments were sarcastic; next time I'll use the /s :)

Anyone who believed Ampere would be on 5nm was nuts, but that highlighted the level of junk rumours going around from both camps. RDNA 2 rumours have been all over the show too: we've had rumours saying it's going to beat the 2080 Ti by 50% (which would make it an RTX 3090 competitor) and rumours saying it might be slower than the RTX 3080. When you see such a wide range of rumours like this, it means people are just making up ****.
 
@five8five Right, I'm 7 to 8 minutes in and he's talking about the 140-watt claims others have made as including the CPU, memory and all the rest.

I don't know where he got that from, but it's not from me; I removed the CPU and the GDDR6 to come up with that number. It's not clear yet what he is or isn't including in his 180-watt claim.
 
@five8five 8m 20s: he just said 180 watts board power minus the CPU, so including the GDDR6. That's pretty much what I said: 180 watts with the GDDR6 and board components; strip all that out, so GPU only, and you get about 140 watts, exactly what I said. :)
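
To spell out that subtraction, a minimal sketch (the ~40W allowance for GDDR6 plus board components is an assumed figure for illustration; the posts only give the 180W and ~140W endpoints):

```python
# Rough power-budget arithmetic behind the ~140 W GPU-only figure.
# The 40 W for GDDR6 + board components is an assumed value for
# illustration; the posts above give only the 180 W and ~140 W numbers.
board_power_minus_cpu_w = 180   # board power excluding the CPU, per the video
gddr6_and_board_w = 40          # assumption: memory ICs, VRMs, fans, etc.
gpu_only_w = board_power_minus_cpu_w - gddr6_and_board_w
print(gpu_only_w)               # 140
```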
 