
RDNA 3 rumours Q3/4 2022

Status
Not open for further replies.
The world went downhill when corona hit transport and manufacturing - substrates, wafers, etc. That, combined with AMD's success beating Intel and Nvidia (especially with the upcoming RDNA3) and the demand from gamers and miners, just shows how a market can change. Without the recent pandemic and mining hype we would have been able to buy cards just fine, the way we can now buy Ryzen CPUs.
Even now Ryzen prices are slowly adjusting (lowering), because demand is no longer outstripping shops' stock.

It's just that prices have gone up overall, and I don't think AMD will lower them, as we've seen with CPUs; soon it will be like that with GPUs and the RDNA3 line-up.
It makes no sense to lower prices when you're the leader.

They certainly beat Intel, but they are still a generation behind Nvidia. I hope it changes with RDNA3, but I can't see it with DLSS appearing in more engines and Nvidia's RT cores being so far ahead in performance. Remember RDNA2 is already on a smaller node than Ampere, 7 vs 8nm, and has up to a 50% increase in clock speed.
 
They certainly beat Intel, but they are still a generation behind Nvidia. I hope it changes with RDNA3, but I can't see it with DLSS appearing in more engines and Nvidia's RT cores being so far ahead in performance. Remember RDNA2 is already on a smaller node than Ampere, 7 vs 8nm, and has up to a 50% increase in clock speed.

Nvidia have to catch up on raster performance - AMD is ahead here. DLSS is a gimmick that compensates for a lack of horsepower, and DXR needs another generation or two from both companies to make it worthwhile - HUB are right on this.
 
They certainly beat Intel, but they are still a generation behind Nvidia. I hope it changes with RDNA3, but I can't see it with DLSS appearing in more engines and Nvidia's RT cores being so far ahead in performance. Remember RDNA2 is already on a smaller node than Ampere, 7 vs 8nm, and has up to a 50% increase in clock speed.

RT is irrelevant to the large majority of PC gamers, who have no interest in it.

And DLSS, while a boost, has its own annoying issues.
 
They certainly beat Intel, but they are still a generation behind Nvidia. I hope it changes with RDNA3, but I can't see it with DLSS appearing in more engines and Nvidia's RT cores being so far ahead in performance. Remember RDNA2 is already on a smaller node than Ampere, 7 vs 8nm, and has up to a 50% increase in clock speed.
I don't play a single game that uses DLSS or looks likely to any time soon - would that mean AMD is still a generation behind for me?

The sad truth is that a lot of the killer titles are on the PS5, which I luckily have.
 
RT is irrelevant to the large majority of PC gamers, who have no interest in it.

And DLSS, while a boost, has its own annoying issues.
I don't play a single game that uses DLSS or looks likely to any time soon - would that mean AMD is still a generation behind for me?

The sad truth is that a lot of the killer titles are on the PS5, which I luckily have.

If you don't want next-gen visuals, even with so many new titles supporting them and with the huge cost saving, AMD is a good choice.

Nvidia have to catch up on raster performance - AMD is ahead here. DLSS is a gimmick that compensates for a lack of horsepower, and DXR needs another generation or two from both companies to make it worthwhile - HUB are right on this.

Even HUB are coming around now that they see support for both RT and DLSS rising.

Are you three hoping AMD also has poor RT support and no serious competition to DLSS in RDNA3?
 
Prices need to come down again, especially in the midrange, where we are now paying more for less performance than was on offer two years ago.

Also, AMD need to win market share and mindshare, as many will just buy Nvidia no matter what the performance looks like. I haven't owned a Radeon card since 2006, so AMD need to tempt me with great performance and prices to match if they want me to ditch Nvidia. If they are content to just price-match Nvidia, then I and many others will simply continue to buy Nvidia.

It's why e-sports players now use Ryzen CPUs and Radeon cards.

They certainly beat Intel, but they are still a generation behind Nvidia. I hope it changes with RDNA3, but I can't see it with DLSS appearing in more engines and Nvidia's RT cores being so far ahead in performance. Remember RDNA2 is already on a smaller node than Ampere, 7 vs 8nm, and has up to a 50% increase in clock speed.

Rasterisation is what matters - unless you buy into the marketing from Nvidia, that is, but then you can't think logically.

Nvidia have to catch up on raster performance - AMD is ahead here. DLSS is a gimmick that compensates for a lack of horsepower, and DXR needs another generation or two from both companies to make it worthwhile - HUB are right on this.

FPS shooters, e-sports, etc. - why use distortion (DLSS) or ray tracing?

RT is irrelevant to the large majority of PC gamers, who have no interest in it.

And DLSS, while a boost, has its own annoying issues.

RT will be a good thing one day.
Technology takes time to evolve, and so do games and engines.
We're not even fully on DX12 yet, after a decade.

RDNA3, as it seems to be tile-based, may be the step up that Ryzen was from Bulldozer; AMD then enhanced each Ryzen generation with ~15% IPC lifts and is now adding 3D V-Cache stacking to an upgraded Zen 3.
People said RDNA2 wouldn't be good either, and then it surpassed Nvidia.
RDNA3 may be the icing on the cake.
 
No, HUB are not coming around at all; just today they said DXR needs another generation, with titles needing it baked in rather than as an add-on. Remember the 130 titles and applications Nvidia PR talk about: 100+ of them are applications that can leverage the tensor cores in some way, and the rest are game titles, which mostly use upscaling. Nvidia are also threatened by Tenstorrent and its incredible ML performance, which is massively faster.
 
Its why e-sport players use Ryzen cpus now and radeon cards.
Are you sure they use Radeon cards? From what I've seen benched on YouTube, Nvidia has lower latency.

Rasterisation is what matters - unless you buy into the marketing from Nvidia, that is, but then you can't think logically.
I have no issue with people clinging to rasterisation, but AMD also market their cards as having RT support, which proves it's not just marketing from Nvidia. What about Intel's upcoming GPUs, which will support both RT and AI?

FPS shooters, e-sports, etc. - why use distortion (DLSS) or ray tracing?
DLSS is great for making your system run cooler while also giving you the option of higher FPS - I think that's why AMD is working on FSR. And of course RT just gives far better IQ/immersion, which I concede isn't a requirement for everyone playing whack-a-mole.

RT will be a good thing one day.
Technology takes time to evolve, and so do games and engines.
We're not even fully on DX12 yet, after a decade.
We are nearing three years of hardware RT and DLSS. It is already good, very good! - https://www.overclockers.co.uk/foru...ames-benchmarks-software-etc-thread.18898329/. Even the recent E3 had quite a few titles with both DLSS and RT on display.

RDNA3, as it seems to be tile-based, may be the step up that Ryzen was from Bulldozer; AMD then enhanced each Ryzen generation with ~15% IPC lifts and is now adding 3D V-Cache stacking to an upgraded Zen 3.
People said RDNA2 wouldn't be good either, and then it surpassed Nvidia.
RDNA3 may be the icing on the cake.
I really hope RDNA3 delivers, as the R9 290 was my last AMD card, but RDNA2 is still a generation behind. I kept my eye on the 6800 XT while I waited for my 3080.
I kept the 3080.
 
If you don't want next-gen visuals, even with so many new titles supporting them and with the huge cost saving, AMD is a good choice.



Even HUB are coming around now that they see support for both RT and DLSS rising.

Are you three hoping AMD also has poor RT support and no serious competition to DLSS in RDNA3?

Not hoping for it, but for esports and strategy titles DLSS and RT are pretty useless, and those make up the majority of PC game sales.

We will see how AMD's competitor to DLSS fares next week, but I don't want to get into an argument about FSR.

As the next-gen consoles evolve, I'm certain we will see games favouring AMD tech more and more.

A good example is Warzone, with AMD outperforming the Nvidia cards with RT turned on - even if you use DLSS. God knows how, with RT.
 
With FSR being more developer-friendly, I'm pretty sure it will be a lot better than DLSS 1.0, which was a resounding failure. It would be much better to play games without DLSS, though, wouldn't it? So many people think they need to play games at 120fps, yet claim they can't see the difference in quality when they use DLSS.
 
Not hoping for it, but for esports and strategy titles DLSS and RT are pretty useless, and those make up the majority of PC game sales.

We will see how AMD's competitor to DLSS fares next week, but I don't want to get into an argument about FSR.

As the next-gen consoles evolve, I'm certain we will see games favouring AMD tech more and more.

A good example is Warzone, with AMD outperforming the Nvidia cards with RT turned on - even if you use DLSS. God knows how, with RT.

I've never played Warzone. There doesn't seem to be much between the GPUs at ~4min in -

I'd love to see AoE 4 supporting RT. The Riftbreaker supports RT shadows; a bit limited, but better than nothing.
 
With FSR being more developer-friendly, I'm pretty sure it will be a lot better than DLSS 1.0, which was a resounding failure. It would be much better to play games without DLSS, though, wouldn't it? So many people think they need to play games at 120fps, yet claim they can't see the difference in quality when they use DLSS.
I use DLSS at 1440p/60 as it keeps my system both cool and quiet. I'm playing Control at the moment with everything maxed and DLSS Quality; my 3080 sits at 51°C with a 22°C ambient.

 
Why are you playing at 900p upscaled? Nvidia need to produce a card that can actually render properly at the output resolution. I play games downscaled to 4K from a 6K render.
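The "900p upscaled" figure can be sanity-checked with quick arithmetic. A minimal sketch below, assuming the commonly reported DLSS 2 preset scale factors (Quality 2/3, Balanced 0.58, Performance 1/2); `dlss_render_res` is an illustrative helper, not a real API:

```python
# Estimate the internal render resolution DLSS uses for a given output
# resolution, from the commonly reported per-axis scale factors.
def dlss_render_res(out_w, out_h, scale):
    return round(out_w * scale), round(out_h * scale)

# Check the presets at a 2560x1440 output:
for mode, scale in [("Quality", 2 / 3), ("Balanced", 0.58), ("Performance", 0.5)]:
    w, h = dlss_render_res(2560, 1440, scale)
    print(f"{mode}: {w}x{h}")
```

Quality mode at 1440p works out to roughly 1707x960, i.e. close to the "900p upscaled" figure being argued about here.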
I use DLSS at 1440p/60 as it keeps my system both cool and quiet. I'm playing Control at the moment with everything maxed and DLSS Quality; my 3080 sits at 51°C with a 22°C ambient.

I still hit 1440p/60 even without DLSS. It's that good - why wouldn't you use it? :D
 
Because DLSS still has problems. And 1440p/60 - is that an RTX 3060 or something? Enjoy Control; for a tech demo it looks `interesting`.

No, 1440p/60 is my panel: 2560x1440 @ 60Hz. I use an ASUS 3080 TUF OC.

I didn't like Control when it first came out, but I'm enjoying it now. The destructible environments are excellent.
 
No, 1440p/60 is my panel: 2560x1440 @ 60Hz. I use an ASUS 3080 TUF OC.

I didn't like Control when it first came out, but I'm enjoying it now. The destructible environments are excellent.

It's my `sort of game`, but for a shooter it needs to be faster (which makes DXR pointless - just ask the Fortnite community about it; they laugh), and DLSS has problems at high refresh rates. For me it's either 1080p/144 or 4K/60 (rendered at 6K and downscaled), depending on the game.
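The 1080p/144 versus 4K/60 trade-off comes down to pixel throughput. A rough sketch, assuming "6K" here means a 1.5x-per-axis supersample of 4K (5760x3240) and ignoring per-pixel shading cost:

```python
# Output pixels per second for a given display mode.
def pixel_rate(w, h, hz):
    return w * h * hz

# The two modes mentioned above:
print(f"1080p/144: {pixel_rate(1920, 1080, 144) / 1e6:.0f} Mpx/s")
print(f"4K/60:     {pixel_rate(3840, 2160, 60) / 1e6:.0f} Mpx/s")

# Rendering at a 1.5x-per-axis "6K" and downscaling to 4K shades
# 2.25x the pixels of native 4K:
print(f"6K->4K/60: {pixel_rate(5760, 3240, 60) / 1e6:.0f} Mpx/s")
```

By this crude measure, native 4K/60 already pushes noticeably more pixels than 1080p/144, and a 6K supersample more than doubles that again.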
 
DLSS has the same issue as G-Sync, and I think it will end the same way.

It's a closed system, and the industry tends to move to generic/standard implementations (however slowly). See Adaptive Sync (FreeSync) and Vulkan (Mantle).
 
DLSS has the same issue as G-Sync, and I think it will end the same way.

It's a closed system, and the industry tends to move to generic/standard implementations (however slowly). See Adaptive Sync (FreeSync) and Vulkan (Mantle).

Well, G-Sync is still a thing, and I may end up buying one now that I've read AMD can run their adaptive sync on G-Sync monitors. FreeSync still has issues at low frame rates, from what I've read.

DLSS can die when the industry comes up with something better, much in the same way as G-Sync.
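The low-frame-rate issue is what Low Framerate Compensation (LFC) addresses: when the game's frame rate falls below the panel's variable-refresh minimum, the driver repeats each frame at a multiple that lands back inside the supported window. A simplified sketch of the idea; `lfc_refresh` and the 48-144Hz window are illustrative, and real drivers are considerably more involved:

```python
# Pick a refresh rate inside the panel's VRR window for a given content
# frame rate, repeating frames (frame doubling/tripling) when it falls
# below the window minimum.
def lfc_refresh(fps, vrr_min=48, vrr_max=144):
    if fps >= vrr_min:
        return fps  # already inside the window, drive it natively
    n = 2
    while fps * n < vrr_min:
        n += 1  # find the smallest multiple that reaches the window
    if fps * n <= vrr_max:
        return fps * n
    return vrr_min  # no multiple fits; fall back to the window minimum
```

So a 30fps dip would be driven at 60Hz with each frame shown twice, which is why a wide VRR window (max at least ~2x min) matters for LFC to work.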
 