AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

I posted in the relevant thread, but FidelityFX is pretty awesome in Death Stranding.

I compared all the possible resolution/AA and detail combinations.

You want to run the maximum resolution you can with FidelityFX on, HDR on, AA on.

The 5700XT just eats it all at 4K 60fps+.

I reckon the 5700 XT is good for 8K with FidelityFX in Death Stranding, albeit at 30fps, but my OLED won't be getting an upgrade until it breaks, so I can't test.
 
Why not just turn down some graphical settings in game and have (nearly) the same image quality in motion?
Seriously, if we're satisfied with "nearly the same", we might as well just turn down the graphics.

Edit: Has anyone done a comparison of Native ultra everything vs Native turned down, vs DLSS ultra everything vs DLSS turned down?

For me, based only on Control (I haven't played Death Stranding), DLSS (v2.0) is something I would probably use in almost every game: I'd rather have higher settings with DLSS on than the opposite.

I would rather have RT on and DLSS on than RT off and DLSS off. My biggest problem with DLSS is that, from what I've played, it wasn't supported for Surround setups (aka multi-monitor), which is a buzzkill.
 
Specs for Arcturus MI100 (CDNA) according to AdoredTV:

7680 cores @ 2700 MHz
32 GB HBM2 on a 4096-bit bus
300 W TDP

FP32 performance 13% faster than Ampere A100
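
Quick sanity check on those numbers - a rough sketch, not gospel. It assumes the usual 2 FLOPs per core per clock (one FMA), and uses NVIDIA's official A100 spec (6912 cores @ 1410 MHz boost):

```python
# Rough peak-FP32 sanity check of the rumored MI100 specs.
# Assumes 2 FLOPs per core per clock (one FMA), which holds for
# both AMD and NVIDIA shader cores.

def fp32_tflops(cores: int, clock_mhz: float) -> float:
    """Peak FP32 TFLOPS: cores * 2 FLOPs/clock * clock."""
    return cores * 2 * clock_mhz * 1e6 / 1e12

mi100 = fp32_tflops(7680, 2700)   # rumored figures above
a100 = fp32_tflops(6912, 1410)    # official A100 spec

print(f"MI100 (rumored): {mi100:.1f} TFLOPS")        # ~41.5
print(f"A100 (official): {a100:.1f} TFLOPS")         # ~19.5
print(f"MI100 vs A100:   {mi100 / a100 - 1:+.0%}")   # ~+113%, not +13%
```

At 2700 MHz the "13% faster" claim doesn't add up, so at least one of the rumored figures has to be wrong.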
 
I'm still inclined to wait for actual official figures before I believe anything about the upcoming releases from both sides.

My head is spinning from all the conflicting rumours that have been appearing on an almost daily basis.
That's CDNA. Not a whole lot has been mentioned about that, other than it being a 120 CU card.


The hope is that we see, as rumored, a 96 CU RDNA card. We've also heard rumors of 72 CU and 80 CU variants.
So far, here is what is rumored:

RDNA:
96 CU could be a 6900 XTX
80 CU could be a 6900 XT
72 CU could be a 6900
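
For scale, here's a rough sketch of those rumored CU counts against Navi 10 (the 5700 XT's 40 CUs) - shader count only, ignoring clocks, IPC and bandwidth, so not a performance prediction:

```python
# Raw CU scaling of the rumored RDNA2 SKUs vs Navi 10 (RX 5700 XT, 40 CUs).
# Shader count only -- ignores clock, IPC and bandwidth changes.

NAVI10_CUS = 40
rumored = {"6900 XTX": 96, "6900 XT": 80, "6900": 72}

for name, cus in rumored.items():
    print(f"{name}: {cus} CUs -> {cus / NAVI10_CUS:.1f}x the 5700 XT's CU count")
```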
 

Yeah, FidelityFX in Death Stranding looks really good. It will be interesting to see how AMD develop it further. The patch they released in May this year really improved it.
 
Just followed those links - they appear to be server components, yes?

You are correct - they are commercial parts using the CDNA uarch. What is more interesting is that AMD has finally managed to make a GPU with over 64 CUs, which was apparently a limit with GCN. So people are reading into that, hoping the same is true of RDNA2. My main concern is not whether AMD can break the 64 CU "limit", but whether the scaling looks any good, and whether their drivers will once again become a bottleneck.
 
"RIP" is a bit tough: 13% faster FP32, but with 11% more cores and clocks 1000 MHz higher. Nvidia may need to bring out the 8192-core full-fat A100 and stick 24 pins on it lol

Clocks 1000 MHz higher? The table states the clock might be 1334 MHz, so lower than Ampere, but to me that does not work out to 25 TFLOPS. To get 25 TFLOPS the clocks would have to be around 1530 MHz with 8192 shaders.
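
That arithmetic checks out. A minimal sketch of the same calculation, using peak FP32 = shaders × 2 FLOPs/clock × clock:

```python
# Clock needed for a target peak FP32 throughput:
# clock = TFLOPS / (2 FLOPs per shader per clock * shaders)

def clock_mhz_for(tflops: float, shaders: int) -> float:
    return tflops * 1e12 / (2 * shaders) / 1e6

print(f"{clock_mhz_for(25, 8192):.0f} MHz")  # ~1526 MHz -- the ~1530 figure above
print(f"{clock_mhz_for(25, 7680):.0f} MHz")  # ~1628 MHz if it's 7680 shaders instead
```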
 
I think it's a rad idea, particularly DLSS 3, which is supposed to work without pre-training.

What I want to see is performance per transistor though, which is where the RTX cards regressed with the inclusion of RT and tensor cores, with the consumer picking up the cost. At the time, DLSS 1.0 was pretty useless, so the move was questionable IMHO. With DLSS 2.0 being very good, I can better understand the move now. But how many extra transistors, in the form of tensor cores (assumed), are required to get that DLSS 2.0 boost?

From this metric you can properly infer "value": maybe the RTX cards cost a fortune, but if they're sporting 50% more transistors than the competition you might say "fair enough". AMD might choose not to compete with DLSS, but for the same transistor count you might get similar performance, DLSS vs. no DLSS, or not.

For example: 5700 XT, 10.3B transistors; RTX 2070 Super, 13.6B transistors. Very similar performance without DLSS.
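
Taking that example at face value (similar performance, published transistor counts), the gap works out like this - just a rough sketch:

```python
# Transistor budgets from the example above; performance treated as
# roughly equal (per the post), so the gap is what RT and tensor cores
# (plus everything else on TU104) cost.

transistors = {
    "RX 5700 XT (Navi 10)": 10.3e9,
    "RTX 2070 Super (TU104)": 13.6e9,
}

baseline = transistors["RX 5700 XT (Navi 10)"]
for name, count in transistors.items():
    print(f"{name}: {count / baseline:.2f}x the 5700 XT's transistor budget")
# TU104 spends ~32% more transistors for similar rasterisation performance.
```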

The idea of DLSS is fair enough; my point is I don't want to pay for something I can't use. If, say, you play a lot of Red Dead Redemption, what use is DLSS if Rockstar don't enable it, regardless of whether it needs pre-training? Will they go back and update GTA 5 so you can play at 4K? Probably not.

DLSS, regardless of version, is not a plug-and-play technology: it requires developer support, which, let's be honest, is a tough sell for games already in the wild and for new titles being developed for console AMD hardware.
 
Think whatever makes you happy; I'm not going to get into a debate about whether you personally hate DLSS or not.

I'm completely serious though. Regardless of your own personal feelings about it, and even if you think DLSS is the crappiest thing ever, if AMD don't have a viable competitor to DLSS soon they are going to be missing a pretty important feature. Reviewers will show Nvidia cards significantly outperforming their AMD equivalents with what, to most "normal" non-pixel-peeping people (i.e. the mass market, which is what really matters to AMD's bottom line), will look like incredibly similar visuals. I would bet the majority couldn't even spot the difference between FidelityFX and DLSS, which would almost invalidate my argument, except that DLSS has a significantly bigger performance gain: FidelityFX only reduces the render resolution to around 75%, whereas DLSS can, for example, upscale 1440p to 4K.
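
Rough pixel-count numbers behind that claim, assuming the ~75% FidelityFX figure is a per-axis resolution scale (as with the usual in-game sliders):

```python
# Rendered pixel counts at a 4K output: DLSS upscaling 1440p -> 4K vs
# FidelityFX at a ~75% per-axis resolution scale.

native = 3840 * 2160                               # 8,294,400 pixels
dlss = 2560 * 1440                                 # 3,686,400 pixels
fidelityfx = int(3840 * 0.75) * int(2160 * 0.75)   # 2880 x 1620 = 4,665,600

print(f"DLSS renders       {dlss / native:.0%} of native pixels")        # ~44%
print(f"FidelityFX renders {fidelityfx / native:.0%} of native pixels")  # ~56%
```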

They really need to show off some similar tech or they will lose out... that's it, that's all I'm saying. I'm not saying whether that's fair or whether the tech is the best thing ever... I think it is fair to say, however, that machine-learning-based upscaling is only going to get better over the coming years, not worse. I'd like to know for sure that AMD is at least on the train rather than standing at the station.
I have to agree. AMD need to compete accordingly, otherwise they will lose out big time, especially as reviewers will showcase Nvidia in the best possible light, aka "Just buy it"!
 
DLSS is neither fake nor cheating if it provides high-quality visuals at a high resolution and a high framerate. It doesn't matter one little bit that it renders at a lower res and applies AI upscaling, and that this is somehow "fake", if the upscaling is good and the end result works.
Yep, I have to agree. I don't care a jot how the final image is created; all I care about is how closely it approximates the reality I see with my own eyes IRL. Criticising how an image is made seems pretty facile and pointless. The threat DLSS poses to AMD is real whether we want to acknowledge it or not, and hopefully AMD are already on the case with a similar iteration of their own.
 

It looks like a lot of our forum members are of the "just buy it" mentality as well, considering the Ampere thread currently has 6882 posts vs only 3816 in this one. Many of them have already made up their minds about getting an Nvidia card.
 
Yep, you have to wonder wtf is happening in the world when people start having the same daft loyalty to a brand that they have to a football team. I guess it shows marketing makes morons out of an awful lot of people!
 
That's to be expected. You must remember this is just a subset, admittedly from their own camp.

One thing that gets daftly omitted is the fact that we're not playing PC-exclusive games any more, though some came to this platform never having known PC-exclusive games. Moving forward, the majority of the games you play are going to come from consoles.

And yet you have a few alarmists proclaiming DLSS 2.0 when it has only been exhibited in one game. Just one game, lol. There is no announcement of any other game using DLSS 2.0, yet they want you to believe that AMD must take it seriously. It's like making a mountain out of an ant hill.

Furthermore, as mentioned by other users in this thread, Nvidia is not a charity. That means they're not going to allow DLSS on a 2000-series Turing card to be on par with Ampere - they would never sell Ampere, lol. That is what the green camp truly needs to be concerned about, not what they think AMD needs to do. :D

The truth is, though, that unfortunately some of them really do believe that with DLSS their 2000-series cards will be only a few frames behind Ampere. Which is a sad delusion.

The point of all this is that they will stay with Nvidia, be it Ampere or Turing.


----------
Truth is, AMD do have a few feature sets of their own which they will announce once the card is released. Another thing is their collaboration with Microsoft which, from initial tests, allowed you to download games via PowerShell directly from the console. No more .exe.

But let us not kid ourselves: the Adrenalin drivers are feature-rich already.
 