
RDNA 3 rumours Q3/4 2022

And this is how it works: it's like the "Intel Inside" logo, where OEMs are paid to put those stickers on there. Nvidia are using AMD's Adaptive Sync / FreeSync and then paying quite a lot of the vendors to put the G-Sync logo on their products. It makes it look like G-Sync is everywhere and FreeSync is nowhere, because AMD are not going to pay, and it's you who are actually paying for it in the price of your GPUs.

A mate recently bought an OLED TV with G-Sync plastered all over the box; in the actual settings menu it says FreeSync 2, but that's not on the box.

This is a very good point. Ideally they would all be marketed as Adaptive Sync (and maybe shown as "validated" in the specs for FreeSync / G-Sync); in reality FreeSync and G-Sync are both vendor implementations of VESA Adaptive Sync.
 
I find this sort of behaviour quite predatory.

AMD came in for a lot of criticism over the narrow frame rate range of FreeSync, quite rightly so, but they put a lot of work, and no doubt money in the form of R&D, into expanding that frame rate range and adding LFC tech and HDR.

Screen vendors don't want to add $150 to the cost of the screen, and given that FreeSync in its second generation is actually quite good, they started to adopt it over G-Sync modules in droves.

Nvidia adopt AMD's standard, great, but then they also start promoting their own products off the back of AMD's work and R&D spend, deliberately pushing AMD branding out of the market while at the same time actually using their tech.

This is why I sometimes make the argument that AMD, when they have the better solution, should just do what Nvidia always do. But if everyone did that, which in this case is how it would be, then where would we be as consumers?
 
Q2 2022 launch window, in the worst case even Q4 2022?!

This is more than a year away from now, a year and a half or even two years after the Radeon RX 6800 series launch?!


The costs are skyrocketing, the end user prices are skyrocketing, the supply gets worse and worse...


What about any new game engines? Unreal Engine 5? New games, something worth buying these new GPUs for?
 
[QUOTE="4K8KW10,


The costs are skyrocketing, the end user prices are skyrocketing, the supply gets worse and worse...


What about any new game engines? Unreal Engine 5? New games, something worth it to buy these new GPUs for?[/QUOTE]

Starfield is out Nov 2022

It's Skyrim in space :)
 
Starfield is out Nov 2022

It's Skyrim in space :)

I think we need something more like Crysis 4, with next-gen visuals, best-in-class physics and photorealism; something that pushes a Radeon RX 6900 XT paired with a Ryzen 9 5900X down to 30 FPS at 3840x2160 on medium settings.
 
Starfield is out Nov 2022

It's Skyrim in space :)
Well, however much I may like Bethesda's open world games, they are hardly going to push the boundaries in terms of game effects. Morrowind/Oblivion/Skyrim were fairly impressive in terms of a large open world, but in terms of models, lighting, rendering? Nowhere near leading edge as far as I recall.
 
Well, however much I may like Bethesda's open world games, they are hardly going to push the boundaries in terms of game effects. Morrowind/Oblivion/Skyrim were fairly impressive in terms of a large open world, but in terms of models, lighting, rendering? Nowhere near leading edge as far as I recall.
Yeah graphics were bang average.
 
Yeah graphics were bang average.
Well, I wouldn't quite say that.
In Skyrim, being able to stand in Whiterun and look up and see the Throat of the World is quite good. Or the opposite, since with an invincible horse you could pretty much ride down from there to Whiterun.
However, with such a big, wide-open world you ended up seeing too much LoD stuff.
I really admire what Piranha Bytes did with Gothic I and II. They made certain areas extra tough so that in the beginning you'd never dare go there, and then they cleverly used the terrain to limit the view distance. This allowed them to load the whole world seamlessly (the interior areas in all TES games are very immersion breaking). Maybe one day, with very powerful cards and very clever LoD pre-renders, a huge horizon will be possible seamlessly, but for now being clever with the terrain is acceptable.
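Very roughly, the trick boils down to distance-based LoD selection plus a hard cap on view distance that the terrain hides. A minimal sketch of that idea in Python; the tier names, distances and cap are made-up illustrative values, not taken from Gothic, Skyrim or any real engine:

Code:
# Toy illustration of distance-based LoD selection with a capped view distance.
# All tiers, distances and the cap are made-up numbers for illustration only.

LOD_TIERS = [
    (50.0, "full detail"),      # within 50 m: full geometry and textures
    (200.0, "reduced detail"),  # 50-200 m: simplified meshes
    (800.0, "impostor"),        # 200-800 m: billboards / pre-rendered stand-ins
]
MAX_VIEW_DISTANCE = 800.0       # terrain walls / fog hide everything beyond this

def lod_for(distance):
    """Pick a LoD tier for an object at the given distance, or cull it."""
    if distance > MAX_VIEW_DISTANCE:
        return None  # culled; the terrain is shaped so you rarely notice the cut-off
    for max_dist, tier in LOD_TIERS:
        if distance <= max_dist:
            return tier
    return None

for d in (10, 120, 500, 1200):
    print(f"{d:>5} m -> {lod_for(d)}")

The point being that the smaller the view distance you can get away with, the less LoD pop-in you have to render or hide, which is exactly what clever terrain layout buys you for free.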
 
"Alleged AMD Radeon RX 7600 XT Navi 33 Specs Suggest It Could Be Faster Than The 6900 XT"

https://hothardware.com/news/amd-radeon-rx-7600-xt-navi-33-specs-faster-6900-xt

Kind of hard to believe a 150W-180W new card would beat the previous 320W flagship card, because that's not just a 50%+ performance improvement but also a 50% performance per watt improvement.
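For what it's worth, a quick back-of-the-envelope check of what that rumour would imply; the 165 W figure is just the midpoint of the rumoured 150-180 W range, performance is normalised to the 6900 XT, and none of this is measured data:

Code:
# Rough perf-per-watt arithmetic for the rumour above (illustrative numbers only).

def perf_per_watt_gain(new_perf, new_watts, old_perf, old_watts):
    """Relative perf/W improvement of the new card over the old one."""
    return (new_perf / new_watts) / (old_perf / old_watts) - 1.0

# Rumour: a ~165 W Navi 33 card (midpoint of 150-180 W) matching the 320 W RX 6900 XT.
gain_vs_flagship = perf_per_watt_gain(1.0, 165.0, 1.0, 320.0)
print(f"perf/W gain vs the 320 W flagship: {gain_vs_flagship:.0%}")  # roughly +94%

Even measured against the 320 W flagship rather than the card's own predecessor, that would be close to a doubling of efficiency in a single generation, which is why it sounds optimistic.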

As far as I'm aware there has never, in the history of discrete GPUs, been such a large single-generation leap. Even with Nvidia's famous Pascal architecture, the GTX 1060 could not come close to beating the previous flagship, the GTX 980 Ti.
 
"Alleged AMD Radeon RX 7600 XT Navi 33 Specs Suggest It Could Be Faster Than The 6900 XT"

https://hothardware.com/news/amd-radeon-rx-7600-xt-navi-33-specs-faster-6900-xt

Reality time...

The 6600 XT is basically a slower 5700 XT in some titles and faster by 1-3% in others, and it costs more than an over-two-year-old product... now, a 6900 XT is over double the performance of a 5700 XT (an over-two-year-old product)... so now a 7600 XT is faster than a 6900 XT... of course :rolleyes::cry:... I'll believe it when I see it. It will again be 10-20% faster than a 5700 XT, and at a higher price again.


Keep feeding the clickbait sites with clicks... and the Twitter users that make up rubbish.
 
Well, I wouldn't quite say that.
In Skyrim, being able to stand in Whiterun and look up and see the Throat of the World is quite good. Or the opposite, since with an invincible horse you could pretty much ride down from there to Whiterun.
However, with such a big, wide-open world you ended up seeing too much LoD stuff.
I really admire what Piranha Bytes did with Gothic I and II. They made certain areas extra tough so that in the beginning you'd never dare go there, and then they cleverly used the terrain to limit the view distance. This allowed them to load the whole world seamlessly (the interior areas in all TES games are very immersion breaking). Maybe one day, with very powerful cards and very clever LoD pre-renders, a huge horizon will be possible seamlessly, but for now being clever with the terrain is acceptable.

Unfortunately I lost my config files; I forgot to back up the latest version and the stupid game resets the graphics settings on a GPU change. But I had the game modified to render high detail to the horizon with lots of grids loaded stably. It was a right ******** to achieve and I could never replicate the settings even from the last backup :(

Posted a few screenshots in one of the Skyrim threads there.
 
Reality time...

The 6600 XT is basically a slower 5700 XT in some titles and faster by 1-3% in others, and it costs more than an over-two-year-old product... now, a 6900 XT is over double the performance of a 5700 XT (an over-two-year-old product)... so now a 7600 XT is faster than a 6900 XT... of course :rolleyes::cry:... I'll believe it when I see it. It will again be 10-20% faster than a 5700 XT, and at a higher price again.


Keep feeding the clickbait sites with clicks... and the Twitter users that make up rubbish.

The rumour is a 128-bit bus card reaching 6900 XT performance; it won't surpass it, but it'll be ballpark the same.
Maybe a reality check for you then.
But I suspect you're the expert here, right?
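For some rough context on the bus-width point, here's a raw bandwidth comparison; the memory speeds below are assumptions on my part, not confirmed specs:

Code:
# Raw memory bandwidth for a 128-bit vs 256-bit bus.
# The GDDR6 per-pin speeds are assumed typical values, not confirmed specs.

def bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    """Raw bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps_per_pin

rx_6900_xt = bandwidth_gb_s(256, 16)       # 256-bit @ 16 Gbps -> 512 GB/s
navi33_rumoured = bandwidth_gb_s(128, 18)  # 128-bit @ 18 Gbps (assumed) -> 288 GB/s

print(f"6900 XT raw bandwidth:      {rx_6900_xt:.0f} GB/s")
print(f"128-bit card raw bandwidth: {navi33_rumoured:.0f} GB/s")

So a 128-bit card would have to lean heavily on cache (e.g. a large Infinity Cache) and clocks to close that raw bandwidth gap, which is exactly what makes the rumour contentious.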
 
Kind of hard to believe a 150W-180W new card would beat the previous 320W flagship card, because that's not just a 50%+ performance improvement but also a 50% performance per watt improvement.

As far as I'm aware there has never, in the history of discrete GPUs, been such a large single-generation leap. Even with Nvidia's famous Pascal architecture, the GTX 1060 could not come close to beating the previous flagship, the GTX 980 Ti.

It's very much possible considering the jump from RDNA 1 to RDNA 2; nobody expected that leap in performance.

Anyway, we just won't know yet, so let the rumours keep coming.
 
I wouldn't be surprised if that were true if we count RT being on. RDNA 2 is just really weak in that scenario so if they remedy that for RDNA 3 then it's not hard to believe it.
 