
RDNA 3 rumours Q3/4 2022

Just to provide an analogy: I used to own an AW2721D 240 Hz 1440p monitor and enjoyed gaming on it for nearly two years, but the HDR performance was decidedly subpar since it was an edge-lit panel. Now I have a Samsung Odyssey Neo G8 and an AW3423DW, and having seen HDR in action, playing at 1440p without it is unplayable to me. That's called tech advancement. Now, both these panels have their flaws: the AW3423DW cannot get bright enough with small highlights without triggering ABL, while the Samsung gets very bright but doesn't look as good in the dark due to haloing. That's why I keep both. In the future there will be panel technologies which can do both at the same time, and that will make both monitors I own right now seem very bad in comparison.

So I don't get your quote here. Cyberpunk at 4K, DLSS Performance, RT Ultra, 60 fps was playable. Tech advancement now means I get the same experience at higher frame rates with DLSS Quality. But the 4090 isn't powerful enough to use DLDSR and RT Psycho settings in Cyberpunk; the 5090 probably will be, which will make the 4090 experience seem bad at that point. This is tech advancement.
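For anyone curious what that mode change actually means in pixels, here's a quick back-of-the-envelope sketch (the per-axis scale factors, 0.50 for Performance and ~0.667 for Quality, are the commonly cited ones, so treat the numbers as approximate):

```python
# Approximate DLSS internal render resolutions at a 4K output.
# Commonly cited per-axis scale factors: Performance = 50%,
# Quality ~= 66.7%. Treat these as approximations.
OUTPUT = (3840, 2160)  # 4K target
MODES = {"Quality": 2 / 3, "Performance": 1 / 2}

for mode, scale in MODES.items():
    w, h = round(OUTPUT[0] * scale), round(OUTPUT[1] * scale)
    print(f"DLSS {mode}: ~{w}x{h} internal ({w * h / 1e6:.1f} MP)")

# Quality pushes roughly 78% more pixels per frame than Performance
# (2560x1440 vs 1920x1080), which is why the same experience at
# Quality needs a faster card.
```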

Let's hope the current trend doesn't continue and we get another price hike on the 5090; at some point even the Nvidia fanatics might start giving it a second glance, wondering how far it needs to go.

Anyway, I don't care for that sort of product, but below that tier things will hopefully get better with AMD competing and making the 4080 16GB look silly.
 
Just to provide an analogy: I used to own an AW2721D 240 Hz 1440p monitor and enjoyed gaming on it for nearly two years, but the HDR performance was decidedly subpar since it was an edge-lit panel. Now I have a Samsung Odyssey Neo G8 and an AW3423DW, and having seen HDR in action, playing at 1440p without it is unplayable to me. That's called tech advancement. Now, both these panels have their flaws: the AW3423DW cannot get bright enough with small highlights without triggering ABL, while the Samsung gets very bright but doesn't look as good in the dark. In the future there will be panel technologies which can do both at the same time, and that will make both monitors I own right now seem very bad in comparison.

So I don't get your quote here. Cyberpunk at 4K, DLSS Performance, RT Ultra, 60 fps was playable. Tech advancement now means I get the same experience at higher frame rates with DLSS Quality. But the 4090 isn't powerful enough to use DLDSR and RT Psycho settings in Cyberpunk; the 5090 probably will be, which will make the 4090 experience seem bad at that point. This is tech advancement.
But why praise a game if you don't get the best experience? What did you do, stop playing and take screenshots at 5 FPS on RT Psycho? Why not say: the game runs like crap and I will wait for better hardware before playing it.
Why was it awesome back then, but now the same level of performance is awful?
To be fair, I played it for 30 minutes and that was it. I didn't like the game and won't play it on any hardware.
 
But why praise a game if you don't get the best experience? What did you do, stop playing and take screenshots at 5 FPS on RT Psycho? Why not say: the game runs like crap and I will wait for better hardware before playing it.
Why was it awesome back then, but now the same level of performance is awful?
To be fair, I played it for 30 minutes and that was it. I didn't like the game and won't play it on any hardware.

I bought the game on release, played 30 mins and haven't since lol, just not my type of game.
 
People will be able to enjoy CP as it is now with RDNA 3, which is great; it is still one of the best-looking games, if not the best, with the "current" RT visuals...

The problem is the upcoming patch, where a **** ton of the RT is being dialled up (like I have always said, what we have seen with the likes of CP in its current form is literally just the tip of the iceberg of what RT can provide; it is being held back considerably, as we still don't have the hardware, nor even the software solutions, to get us to where we ultimately want to be). Also, the difference between the 40xx and RDNA 3 could very well be much bigger when that patch lands, and/or when more games are optimised using Nvidia's new SER (shader execution reordering) method, which apparently improves performance by up to 43/48% on 40xx hardware...

Ultimately, we need to start seeing more Metro EE kinds of games, but again, that won't happen until old-gen consoles are dumped. The "we don't have a budget option for RT" argument has been debunked many times already, as shown by Metro EE running on current-gen consoles and weaker/older RT-capable GPUs.

Essentially it is just going to be RDNA 2 vs Ampere all over again: games will now start pushing RT a bit more with the new, improved hardware we have, and RDNA 3, as well as Ampere, will simply need settings reduced compared to the 4080/4090 over the next two years.
That is dreaming, to be honest. The only games pushing RT will be sponsored by Nvidia, and they will most likely run like crap on your old card anyway, because they don't want you to have the best visuals without opening your wallet every time.
And the truth is, we can have an almost infinite number of overdrive mods, and each of them will look great in Nvidia promotional materials when compared with RT off. You just have to increase the number of samples per pixel or bounces and the game becomes unplayable on the 3090, the 4090, the 5090 and so on.
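To make that scaling argument concrete, here's a toy cost model (purely illustrative numbers, assuming ray count grows roughly linearly with samples per pixel and bounces):

```python
# Toy path-tracing cost model. Workload scales with resolution,
# samples per pixel (spp) and bounce count, so cranking either spp
# or bounces multiplies the ray count. All numbers are illustrative.
PIXELS_4K = 3840 * 2160

def rays_per_frame(spp: int, bounces: int, pixels: int = PIXELS_4K) -> int:
    # One primary ray per sample plus one secondary ray per bounce.
    return pixels * spp * (1 + bounces)

baseline = rays_per_frame(spp=1, bounces=2)   # a "playable" preset
overdrive = rays_per_frame(spp=4, bounces=8)  # a hypothetical mod preset

print(f"baseline : {baseline / 1e6:.0f} million rays/frame")
print(f"overdrive: {overdrive / 1e6:.0f} million rays/frame")
print(f"overdrive is ~{overdrive / baseline:.0f}x the baseline workload")
```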
 
Just to provide an analogy: I used to own an AW2721D 240 Hz 1440p monitor and enjoyed gaming on it for nearly two years, but the HDR performance was decidedly subpar since it was an edge-lit panel. Now I have a Samsung Odyssey Neo G8 and an AW3423DW, and having seen HDR in action, playing at 1440p without it is unplayable to me. That's called tech advancement. Now, both these panels have their flaws: the AW3423DW cannot get bright enough with small highlights without triggering ABL, while the Samsung gets very bright but doesn't look as good in the dark due to haloing. That's why I keep both. In the future there will be panel technologies which can do both at the same time, and that will make both monitors I own right now seem very bad in comparison.

So I don't get your quote here. Cyberpunk at 4K, DLSS Performance, RT Ultra, 60 fps was playable. Tech advancement now means I get the same experience at higher frame rates with DLSS Quality. But the 4090 isn't powerful enough to use DLDSR and RT Psycho settings in Cyberpunk; the 5090 probably will be, which will make the 4090 experience seem bad at that point. This is tech advancement.

I don't understand how, two years later, Cyberpunk still excites people. How many hours have you put into it? I must be missing a trick, because I played it on a 3070 FE at release and haven't felt remotely interested in putting more time into it.
 
That is dreaming, to be honest. The only games pushing RT will be sponsored by Nvidia, and they will most likely run like crap on your old card anyway, because they don't want you to have the best visuals without opening your wallet every time.
And the truth is, we can have an almost infinite number of overdrive mods, and each of them will look great in Nvidia promotional materials when compared with RT off. You just have to increase the number of samples per pixel or bounces and the game becomes unplayable on the 3090, the 4090, the 5090 and so on.
Nvidia does this all the time. They sponsor a tech-demo game which only runs OK on their new generation and not so great on the previous one. Atomic Heart is the Cyberpunk 2077 of the Ada Lovelace generation. It was also quite clear Cyberpunk 2077 had very poor rasterised effects to emphasise the RT settings. Another old Nvidia trick. They did it with PhysX: hobbled the normal CPU-run game physics, used x87 code to hobble the CPU fallback layer, and over-emphasised the PhysX effects to make them more obvious. It's like what those dodgy salesmen do when they have two TVs and try to sell you the £100 HDMI cable by fiddling with the TV settings or video quality.
 
I don't understand how, two years later, Cyberpunk still excites people. How many hours have you put into it? I must be missing a trick, because I played it on a 3070 FE at release and haven't felt remotely interested in putting more time into it.

It's a crap game; it's just a performance benchmark. I hope they don't screw up the new Witcher game like this.
 
Let's hope the current trend doesn't continue and we get another price hike on the 5090; at some point even the Nvidia fanatics might start giving it a second glance, wondering how far it needs to go.

Anyway, I don't care for that sort of product, but below that tier things will hopefully get better with AMD competing and making the 4080 16GB look silly.
Graphics cards are going to be more expensive; that looks like the general trend... and I don't expect a regression in power consumption.
It needs something radical to buck the trend.
What might instead happen is that a 5070-tier card performs comfortably at 4K, so it would be up to the buyer to decide on a more economical upgrade path.
 
But why praise a game if you don't get the best experience? What did you do, stop playing and take screenshots at 5 FPS on RT Psycho? Why not say: the game runs like crap and I will wait for better hardware before playing it.
Why was it awesome back then, but now the same level of performance is awful?
To be fair, I played it for 30 minutes and that was it. I didn't like the game and won't play it on any hardware.
But it was a good experience. I have sunk 240 hours into my save file playing at 4K DLSS Performance with RT, and I was getting 60 FPS most of the time. It wasn't 5 FPS. Did you see the benchmarks of a 3080 Ti in Cyberpunk?

It's no longer a good experience for me because the next generation has now given me a way to turn on RTGI in the game AND play at DLSS Quality while simultaneously getting much higher FPS than before. My experience with the game is much more enjoyable than it was. Could it be improved further? Yes: I would love to use DLDSR and downscale from 1.78x to get a pristine, clear image which looks better than native 4K. I love games which push graphical boundaries, as they give me a reason to buy new hardware and enjoy it. That does not invalidate the good experience I had with the game at lower settings on the hardware available at the time.
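For context on the DLDSR maths: the factors Nvidia exposes (1.78x, 2.25x) multiply the total pixel count, so each axis scales by the square root of the factor. A quick sketch:

```python
import math

# DLDSR factors (1.78x, 2.25x) apply to TOTAL pixel count, so each
# axis scales by sqrt(factor). Output here assumes a 4K native panel.
NATIVE = (3840, 2160)

for factor in (1.78, 2.25):
    axis = math.sqrt(factor)
    w, h = round(NATIVE[0] * axis), round(NATIVE[1] * axis)
    print(f"DLDSR {factor}x: render at ~{w}x{h}, downscale to {NATIVE[0]}x{NATIVE[1]}")
```

So 1.78x on a 4K panel means rendering at roughly 5120x2880 before the downscale, which is why even a 4090 runs out of headroom with RT Psycho on top.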

The Crysis developers said in an interview that they wanted to push their remastered games well beyond what current-gen hardware could do, but decided not to because gamers with high-end GPUs got mad that their 1k GPU could not max out the game. I do not follow that line of thinking.
 
I don't understand how, two years later, Cyberpunk still excites people. How many hours have you put into it? I must be missing a trick, because I played it on a 3070 FE at release and haven't felt remotely interested in putting more time into it.
I still ain't played it due to the GPU crisis. Bought it on release for full whack; performance was too bad to play on my current PC. Been waiting for a new PC for years now.

Refuse to pay the prices we're currently charged for GPUs :D
 
I don't understand how, two years later, Cyberpunk still excites people. How many hours have you put into it? I must be missing a trick, because I played it on a 3070 FE at release and haven't felt remotely interested in putting more time into it.
The RPG mechanics and interactivity of the world are very poorly done and seem incomplete. NPC AI is worse than in Bethesda games. The game seems to lack variety in character models and seems to use very few voice actors for NPCs. On many structural levels it's a worse game than The Witcher 3. People who just play COD seem to think it's OK because it looks shiny.

If you look at the RPG aspects CDPR were emphasising pre-launch, it does not come close.
 
That is dreaming, to be honest. The only games pushing RT will be sponsored by Nvidia, and they will most likely run like crap on your old card anyway, because they don't want you to have the best visuals without opening your wallet every time.
And the truth is, we can have an almost infinite number of overdrive mods, and each of them will look great in Nvidia promotional materials when compared with RT off. You just have to increase the number of samples per pixel or bounces and the game becomes unplayable on the 3090, the 4090, the 5090 and so on.

Not true. Plenty of AMD-sponsored games are pushing RT too, just having to hold back the quality because their cards can't handle it as well as Nvidia's do. Intel will also be pushing sponsored games to use RT, and even without sponsorships we'll see RT being pushed; it's just the nature of new tech and the next step for developers.

Agree though, Nvidia will be upping the effects; obviously, like any business, they want to sell the advantage of their hardware, the same way AMD will now want to sell the advantage of their 20+ GB VRAM and DP 2.1 "8K" displays. But those kinds of titles won't come along too often, i.e. CP 2077 overdrive mode, Portal RTX (similar to the Quake RTX implementation). Of course, this is also where DLSS 3/FG will be sold.

I don't understand how, two years later, Cyberpunk still excites people. How many hours have you put into it? I must be missing a trick, because I played it on a 3070 FE at release and haven't felt remotely interested in putting more time into it.

176 hours here :)

Holding off continuing my 3rd playthrough until the next patch (which will be my last playthrough). Regardless of visuals, it's one of my favourite games of the past few years. I love the game world design (and how the city feels lived-in/alive; that's always my biggest complaint with open-world games, they just feel dead, which is immersion-breaking for me), the characters etc., but then I'm a big fan of films like The Fifth Element and Blade Runner, so naturally this game has a lot in common with the style of those films.

Nvidia does this all the time. They sponsor a tech-demo game which only runs OK on their new generation and not so great on the previous one. Atomic Heart is the Cyberpunk 2077 of the Ada Lovelace generation. It was also quite clear Cyberpunk 2077 had very poor rasterised effects to emphasise the RT settings. Another old Nvidia trick. They did it with PhysX: hobbled the normal CPU-run game physics, used x87 code to hobble the CPU fallback layer, and over-emphasised the PhysX effects to make them more obvious. It's like what those dodgy salesmen do when they have two TVs and try to sell you the £100 HDMI cable by fiddling with the TV settings or video quality.

Wouldn't say that, tbh. I played it the first time on my Vega 56, so no RT, and even then the lighting, shadows and reflections all looked fantastic and far better than in other games; they still do today too.
 
That is dreaming, to be honest. The only games pushing RT will be sponsored by Nvidia, and they will most likely run like crap on your old card anyway, because they don't want you to have the best visuals without opening your wallet every time.
And the truth is, we can have an almost infinite number of overdrive mods, and each of them will look great in Nvidia promotional materials when compared with RT off. You just have to increase the number of samples per pixel or bounces and the game becomes unplayable on the 3090, the 4090, the 5090 and so on.
Who says only Nvidia-sponsored games have RT? Spider-Man Remastered? The entire Resident Evil remastered series? Far Cry 6? Elden Ring (which will get an RT patch soon)? Forza Horizon 5? They're not pushing the limits of RT hardware, but any time you turn RT on, the Nvidia hardware is barely impacted while the AMD hardware regresses to 3000-series levels of performance.
 
Who says only Nvidia-sponsored games have RT? Spider-Man Remastered? The entire Resident Evil remastered series? Far Cry 6? Elden Ring (which will get an RT patch soon)? Forza Horizon 5? They're not pushing the limits of RT hardware, but any time you turn RT on, the Nvidia hardware is barely impacted while the AMD hardware regresses to 3000-series levels of performance.

Buuut you need to put in an extra £700 for it? Or just use FSR to get 60 fps with RT, and have raster close to the 4090 for £700 less?
That being said, I wouldn't pay £1,700 for a GPU, let alone £1K; I prefer value per buck in the £600-£750 range. Maybe next generation :)
 
Who says only Nvidia-sponsored games have RT? Spider-Man Remastered? The entire Resident Evil remastered series? Far Cry 6? Elden Ring (which will get an RT patch soon)? Forza Horizon 5? They're not pushing the limits of RT hardware, but any time you turn RT on, the Nvidia hardware is barely impacted while the AMD hardware regresses to 3000-series levels of performance.
FSR is better than RT
 
Not true. Plenty of AMD-sponsored games are pushing RT too, just having to hold back the quality because their cards can't handle it as well as Nvidia's do. Intel will also be pushing sponsored games to use RT, and even without sponsorships we'll see RT being pushed; it's just the nature of new tech and the next step for developers.

Agree though, Nvidia will be upping the effects; obviously, like any business, they want to sell the advantage of their hardware, the same way AMD will now want to sell the advantage of their 20+ GB VRAM and DP 2.1 "8K" displays. But those kinds of titles won't come along too often, i.e. CP 2077 overdrive mode, Portal RTX (similar to the Quake RTX implementation). Of course, this is also where DLSS 3/FG will be sold.



176 hours here :)

Holding off continuing my 3rd playthrough until the next patch (which will be my last playthrough). Regardless of visuals, it's one of my favourite games of the past few years. I love the game world design (and how the city feels lived-in/alive; that's always my biggest complaint with open-world games, they just feel dead, which is immersion-breaking for me), the characters etc., but then I'm a big fan of films like The Fifth Element and Blade Runner, so naturally this game has a lot in common with the style of those films.



Wouldn't say that, tbh. I played it the first time on my Vega 56, so no RT, and even then the lighting, shadows and reflections all looked fantastic and far better than in other games; they still do today too.

It's a single-player game? 3rd playthrough??? Isn't it the same each time?? Personally I play once and that's me done with single-player games; it's like when I hear about people watching the same movie again, I don't get it.
 
FSR is better than RT

 
Buuut you need to put in an extra £700 for it? Or just use FSR to get 60 fps with RT, and have raster close to the 4090 for £700 less?
That being said, I wouldn't pay £1,700 for a GPU, let alone £1K; I prefer value per buck in the £600-£750 range. Maybe next generation :)
My 3080 Ti already gives me 60 FPS with RT and DLSS Performance.

The raster performance jump isn't that important to me because I don't notice any perceptible difference past 80-90 FPS, and my 3080 Ti already gives me that with DLSS Quality. If I start getting higher FPS, over 100 or so, I use DSR to clean up the image, as that way at least I can use the headroom to meaningfully improve image quality. But I would prefer to spend the headroom on RT first and then DSR, in order of visual impact. There is also the fact that even the most powerful CPUs on the market start bottlenecking the 4090 in raster if you turn on DLSS.
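A quick frame-time sketch (illustrative figures only) of why I think in terms of headroom rather than raw FPS:

```python
# FPS gains shrink in frame-time terms as the number goes up; the
# milliseconds saved are the "headroom" that can be re-spent on RT
# or DSR instead. Figures are illustrative.
def frame_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 90, 120, 144):
    print(f"{fps:>4} FPS = {frame_ms(fps):.1f} ms per frame")

# Going from 90 to 144 FPS only frees ~4.2 ms per frame; spending
# that budget on heavier RT or DSR supersampling is a bigger
# visual win than the extra frames.
print(f"headroom from 90 to 144 FPS: {frame_ms(90) - frame_ms(144):.1f} ms")
```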

I agree the card is very expensive, but it's my hobby and I love shiny new hardware.
 