AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status: Not open for further replies.
[Attached image: QGdiW2F.png]


Posted above I've just seen. It's starting!
Yup, it was first posted by @docsonic

AMD need to keep dropping little hints like this right through until October.
 
There's so much hype around DLSS and RTX IO, though, that I think people need assurances that AMD will have equivalent features.

No idea about DLSS, but NVIDIA stated that they worked with Microsoft to add the APIs for this to the OS (direct storage access between PCIe devices is more of an OS and file-system change than a GPU driver change). That would mean games would use the feature much like they do normal Windows APIs.

It will end up in both AMD and NVIDIA drivers
 
We already know they do, at least in some form. They have sharpening and IO technology in the consoles. It would be unfathomable of them not to replicate this in RDNA2 cards.
I do use the sharpening from 2160p down to 1800p and things look fine; below that, picture quality suffers too much on my 4K monitor. DLSS can provide an extra 50% performance on top of that, which is still a huge deal.
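The ~50% performance claim roughly tracks the pixel count. A quick sanity check (my own arithmetic, not from the thread; the 3200×1800 resolution is the standard 16:9 figure for "1800p"):

```python
# Sanity check: rendering at 1800p instead of native 2160p cuts the
# pixel count by about 31%, which is roughly where upscaling plus
# sharpening gets its performance back.
def pixel_count(width, height):
    return width * height

native = pixel_count(3840, 2160)   # 4K UHD
scaled = pixel_count(3200, 1800)   # 1800p at the same 16:9 aspect

saving = 1 - scaled / native
print(f"1800p renders {saving:.0%} fewer pixels than 2160p")
# → 1800p renders 31% fewer pixels than 2160p
```

Actual frame-time savings vary per game, since not all GPU work scales with resolution.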
 
If AMD can do a Ryzen on Nvidia with their GPUs, this might just tempt me to go AMD twice in a row, something I haven't done for years. I was going to sit this round out too.
 
To repeat what has been said before.

AMD already have a 56 CU RDNA2 GPU (52 active) with RT, a form of DLSS and high-speed memory caching technology, at 175 watts with an 8-core 16-thread CPU, at 360mm².

That GPU is about as fast as a 2080 Ti; with the CPU taken out it's about 140 watts and under 300mm². If AMD can't at least match the 3080 with an 80 CU version of that, something is very wrong, and it would be far more power efficient (around 250 watts) at around 400mm².
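The 80 CU extrapolation above can be sketched as simple linear scaling from the post's own console estimates (a back-of-the-envelope exercise, not official specs; linear power scaling with CU count is a rough assumption):

```python
# Back-of-the-envelope scaling of the figures quoted in the post.
# All inputs are the post's estimates, not official specs.
console_cus = 52          # active CUs in the console APU
console_gpu_watts = 140   # post's estimate with the CPU taken out

target_cus = 80
est_watts = console_gpu_watts * target_cus / console_cus
print(f"~{est_watts:.0f} W for an 80 CU part at the same clocks")
# → ~215 W for an 80 CU part at the same clocks
```

Linear scaling lands around 215 W, so the post's ~250 W figure leaves headroom for the higher clocks a desktop card would likely run.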
 
This gets trotted out every single time... bear in mind the people claiming yield issues also claimed the 3080/90 would be on TSMC 7nm.

Ah, I know, take it with a pinch of salt, but this year has been one helluva weird year; anything could happen. It's like someone else said: if 100,000 units hit the UK on 17th September and there are 200,000 waiting customers, then there will be no stock left.
 
I do use the sharpening from 2160p down to 1800p and things look fine; below that, picture quality suffers too much on my 4K monitor. DLSS can provide an extra 50% performance on top of that, which is still a huge deal.

Nah, 10 games may have it, 99.999999% don't, so it's not even a consideration.
 
The problem is that a DLSS alternative would still most likely require per-game implementation, and that means it will be excluded from Nvidia-sponsored games which already feature DLSS. Most importantly, Nvidia will still have an insurmountable advantage in Cyberpunk 2077 (and a host of other upcoming major releases this holiday season), and I don't think most people are going to be happy giving that up. I know I sure as heck wouldn't.
 
Honestly, I think I'll probably just bend over to NVIDIA. NVIDIA are ahead in VR implementations, I still haven't seen any evidence of AMD's ray tracing tech in action on PC, and AMD's DLSS equivalent isn't here... while Nvidia's, sadly, is.

Both companies have a lot of work to do on making developers adopt their respective technologies.

I think the GPUs both companies release this year will end up being short-term cards. They won't last the distance, as GPU requirements (both in speed and VRAM) will likely increase in the next 24 months. Where the AMD cards might fall short is speed, as I personally don't see them matching the 3080 given they still don't have a 2080 Ti equivalent card.

I hope I'm wrong.

The games which do use DLSS 2.0 will obviously destroy the AMD equivalents, and that's sad. If NVIDIA can continue to support game developers and splash the cash on DLSS, they will win versus AMD, but it's a big if, as widespread adoption of DLSS 2.0 hasn't been there.
 
FYI, I'm still waiting. If AMD drop a 3070 equivalent card (which is what I'm expecting) with 16GB VRAM, it should change the pricing of the 3080 and 3070 a little. Also, we're all buying new GPUs for Cyberpunk... why not wait till it's released?
 
The problem is that a DLSS alternative would still most likely require per-game implementation, and that means it will be excluded from Nvidia-sponsored games which already feature DLSS. Most importantly, Nvidia will still have an insurmountable advantage in Cyberpunk 2077 (and a host of other upcoming major releases this holiday season), and I don't think most people are going to be happy giving that up. I know I sure as heck wouldn't.

Not necessarily; it depends on how the consoles want to handle resolution scaling. I'm sure they will want an API-level version of it.

RDNA1 already does a pretty good job of it without the use of special hardware. I use it globally; it just needs that extra 5% to bring it in line with DLSS 2.
 
FYI, I'm still waiting. If AMD drop a 3070 equivalent card (which is what I'm expecting) with 16GB VRAM, it should change the pricing of the 3080 and 3070 a little. Also, we're all buying new GPUs for Cyberpunk... why not wait till it's released?
Of course every logical person is waiting, and for many of us the 3080 is dead on arrival with its 10GB VRAM.
 
Yeah, I wouldn't pay £650 for a 10GB card no matter how fast it is. I don't think my 5700XT has enough with 8GB; I don't need more GPU muscle, I need more VRAM.

According to Coreteks it's a 3070 competitor.

Coreteks is an idiot. AMD are not charging £150 more than Nvidia for the same-level card; if they are, they might as well not bother.
 
Not necessarily; it depends on how the consoles want to handle resolution scaling. I'm sure they will want an API-level version of it.

RDNA1 already does a pretty good job of it without the use of special hardware. I use it globally; it just needs that extra 5% to bring it in line with DLSS 2.

Right. So again, we don't need to reiterate that the RDNA2 dGPU is going to be a nice alternative in the space between the 5700s and a 3080. Even better, if people don't want the RTX or IO hype, they can pick up a brand new card supporting HDMI 2.1 with 16GB VRAM for less? :)
 
Not necessarily; it depends on how the consoles want to handle resolution scaling. I'm sure they will want an API-level version of it.

RDNA1 already does a pretty good job of it without the use of special hardware. I use it globally; it just needs that extra 5% to bring it in line with DLSS 2.

I don't know that I'd expect uniformity even then. On PlayStation, Sony had their own hardware for checkerboard rendering, but it wasn't available on Xbox and wasn't always ported to PC builds. For next-gen, Microsoft has sort of talked about some ML-assisted upscaling for the Xbox but didn't elaborate, and Sony and AMD have said nothing.

Let's hope they figure out a global solution though, I'd be happy.
 
Right. So again, we don't need to reiterate that the RDNA2 dGPU is going to be a nice alternative in the space between the 5700s and a 3080. Even better, if people don't want the RTX or IO hype, they can pick up a brand new card supporting HDMI 2.1 with 16GB VRAM for less? :)

I'll pay £400 for a card a bit faster than my 5700XT with 16GB of GDDR6, Ray Tracing is a bonus but it will have that.

On Coreteks, he's clearly not thinking straight; the consoles are a competitor to a 3070. What is he saying, that AMD cannot better their own console APUs?
 
I don't know that I'd expect uniformity even then. On PlayStation, Sony had their own hardware for checkerboard rendering, but it wasn't available on Xbox and wasn't always ported to PC builds. For next-gen, Microsoft has sort of talked about some ML-assisted upscaling for the Xbox but didn't elaborate, and Sony and AMD have said nothing.

Let's hope they figure out a global solution though, I'd be happy.

I'm happy with the solution that's in my drivers right now. It's 95% of DLSS 2, and you switch it on globally and forget about it.
 
I'll pay £400 for a card a bit faster than my 5700XT with 16GB of GDDR6, Ray Tracing is a bonus but it will have that.

On Coreteks, he's clearly not thinking straight; the consoles are a competitor to a 3070. What is he saying, that AMD cannot better their own console APUs?

I know, and it's gonna be more efficient than your 5700XT with all the RDNA2 goodies. So even if they had stood still (which they haven't), let's call it the 6700XT would still come out 25% better, and we know they will have pushed the hardware like they did for RDNA1.
 