
*** The AMD RDNA 4 Rumour Mill ***

If a person has no interest in RT, then anyone with a 7800+ doesn't really need to upgrade anytime soon. Take me, for example: I turn RT on in a game, say "oh, shiny", then turn it off again. Correct reflections and shadows have never been important to me, and I'd wager it's the same for the vast majority of everyday people who just want to play games.


Do you have a high end GPU?

I hear ya, but I also know from experience that some people buy high-end GPUs because they want to max out all the settings, and if you're not doing that, then what's the point of a high-end GPU, right?

Maybe it's all about framerate now. Things have certainly changed in the gaming landscape. I come from a time, when I first started PC gaming, where people with high-end GPUs could get an advantage in multiplayer games due to the vast difference between low and high settings; it was almost like two different games back then. Now low and high look almost the same in just about every game.

He's in the market for one; he's currently with Nvidia. I forget exactly what it is, if he even told me :D
 
GTX 970
GTX 1070
RTX 2070S

I buy in the mid-range, for 1440p, but upgrade every generation.

That only stopped when Jensen vomited out what he called the RTX 3070. I still didn't buy AMD; I struggled on with the 2070S for another year and still didn't. Then Jensen vomited again. I still didn't act, but by now I was getting itchy and increasingly peed off.

Until I decided. I had to decide, as my GPU's third year was now long behind it. I did a lot of soul-searching and a lot of research and realised, among other things, that what it would cost me to get a good 4070-class card, one that didn't annoy me, one that I actually liked and could take pride in owning, was getting on for nearly as much as an RX 7900 XT. The price difference to move up to that card was less than the difference down to the Sapphire RX 7800 XT Pulse, which I had been eyeing up for months, knowing it was a good, solid version of that GPU.

So I bought it.

Had the vanilla 4070 been £480, maybe £500, for a good one, I would have bought it. What I really wanted was 16GB, which the 7800 XT also had, but I would have settled for 12GB with DLSS; it's silly money, though, for one with a cooler that's as good as the one on mine.

And you know what, in the last 2 years quite a few people in my Discord have ditched Nvidia for AMD; I was pretty much the last one still holding out...
Fair enough. Enjoy it! :)
I checked the RTX 4070 Super Ghost review on TPU.

The RX 7800 XT Pulse has a decent cooler, and it can be had under RRP too. The RX 7700 XT version sits at around 26 dBA under load according to TPU. All of these cards could do with a price cut, though; the RX 7800 XT and RTX 4070 should really be close to £400.

TPU has the 7900 XTX at 39.2 dBA. I don't think anyone complained about the sound, just the hot spot bug. And it's no wonder: 40 dB is the sound level of a quiet library or office. Moreover, you have a 6°C margin before it throttles back. Over-engineered, over-specced coolers do add to the price.
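For a rough sense of what those numbers mean, here's a back-of-envelope sketch using two standard acoustics rules of thumb (sound pressure scales as 10^(dB/20), and perceived loudness roughly doubles per +10 dB); the dBA figures are the TPU ones quoted above, nothing else is from the thread:

```python
# Back-of-envelope loudness comparison. Rules of thumb: sound pressure
# scales as 10^(dB/20); perceived loudness roughly doubles per +10 dB.
quiet = 26.0   # RX 7700 XT Pulse under load, dBA (per TPU, quoted above)
loud = 39.2    # RX 7900 XTX under load, dBA (per TPU, quoted above)

delta = loud - quiet                 # 13.2 dB
pressure = 10 ** (delta / 20)        # ~4.6x the sound pressure
loudness = 2 ** (delta / 10)         # ~2.5x perceived loudness
print(f"{delta:.1f} dB apart: ~{pressure:.1f}x pressure, ~{loudness:.1f}x as loud")
```

So the XTX cooler is meaningfully louder on paper, but both sit around quiet-office levels, which fits the point that nobody was actually complaining about the noise.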
 
4070 Super: £549
7800 XT: £480 (but it's slower, even in raster)
7900 GRE: £510 (about the same; 1-2% doesn't really matter)

So it's more like £39 / £69, but for the latter you do get a better card for the extra money, even in raster.
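Just to make the arithmetic explicit, a quick sketch using only the prices quoted above (nothing here beyond the post's own numbers):

```python
# Price gaps between the cards listed above, using the quoted UK prices.
prices = {"RTX 4070 Super": 549, "RX 7800 XT": 480, "RX 7900 GRE": 510}

baseline = prices.pop("RTX 4070 Super")
for card, price in prices.items():
    print(f"{card}: £{price}, i.e. £{baseline - price} less than the 4070 Super")
# -> RX 7800 XT: £69 less; RX 7900 GRE: £39 less
```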

If AMD copy-pastes the same strategy with the next gen... nothing really changes.

Not to mention, as per usual, he is ignoring everything else Nvidia offers :cry: But nah, you're just paying for DLSS and nothing else..... :o

If a person has no interest in RT, then anyone with a 7800+ doesn't really need to upgrade anytime soon. Take me, for example: I turn RT on in a game, say "oh, shiny", then turn it off again. Correct reflections and shadows have never been important to me, and I'd wager it's the same for the vast majority of everyday people who just want to play games.

I'm not all that bothered about them being "correct" (unless the game's raster looks beyond bad, like DL 2's raster), especially when devs can do very convincing jobs with raster (even if it does take them months/years to get to the same level). For me it ended up being more about using them to get away from the downright broken visual glitches present with raster, e.g. Horizon Forbidden West, which is crowned one of the best-looking games, and it does look absolutely stunning at times, but its raster implementation drags it down to PS4 levels in so many scenes, and it's very jarring and immersion-breaking. It's like the UE5 Matrix tech demo: it looks so good, but the animation sticks out like a sore thumb.

The point with RT is that AMD could somewhat get a pass mark for lacking here with RDNA 2, and arguably RDNA 3, but going forward they absolutely need to improve this for RDNA 4, especially if we start to see more games like Avatar and Spider-Man 2 which won't allow users to turn off RT. To have your current top-end GPU only matching your competitor's 3+ year old second/third-tier high-end GPU in this area is quite frankly embarrassing, imo.
 

AMD's net worth is a tenth of Nvidia's. I wouldn't say embarrassing, considering Nvidia has a LOT more R&D cash to throw at things like RT than AMD does.

Nvidia also hires the best and brightest bloodthirsty types straight from places like MIT, whereas AMD tends to hire the hippy "let's make everything open source, maaaannnnn" types, which isn't helping them in the long run.
 

It's not like AMD aren't worth billions here.... They invest a considerable amount into their R&D:

[charts: AMD R&D spending]

The reason is, as we all know and agree, that PC gaming simply is not AMD's focus; they have far more lucrative business paths, and as long as they have the consoles they probably won't care. That's completely OK, just not great for PC gamers when one side is dominating. I also think AMD are very poor at developer relations, in the sense that they want to be as hands-off as possible (one of the reasons for going open source) and don't really have that looking-ahead sense of "what's next", which leaves them on the back foot, scrambling to get something out the door. The fact that FSR still isn't widely used in Xbox and PS games says it all, imo.
 

For GPUs specifically, though, AMD's R&D budget is a pebble next to Nvidia's Everest-sized budget.
 
Intel was also much bigger than AMD, but with a smart move in Ryzen they managed something good. RDNA 4 should go in that direction if they want to change something.
 

Yep. I'm curious whether they'll go the chiplet route with their GPUs too, but instead of having the dies spaced out like Ryzen, have them connected at the silicon level, like Apple does with its top-end silicon.
 
I switched to AMD after 8 years with Nvidia.

I would only use upscaling tech as an absolute last resort. That's the problem with people trying to make a convincing argument for why I should still be buying Nvidia: upscaling tech is a critical part of the equation, when for me it shouldn't even exist. You're willing to pay more money for a lesser card and then make back the difference with DLSS.
To me that's asinine.

Just buy the better card, for less, and then not need it.
I'm on a 4090 and I still use upscaling to drive my 240 Hz OLED monitor, because I can't see the difference in quality in motion, and the motion clarity of 150-200 fps on an OLED is insane. I actually subscribed to PureDark's Patreon for his DLSS mods for unsupported games.

I would rather take a slightly less sharp image from DLSS and layer RT on top than use native TAA. Most games' TAA is just horrible. Just look at Forbidden West's TAA flickering on vegetation. Even DLSS at a 58% render scale makes it look better.

The higher FPS is just a bonus. To me, losing the option and customisation of DLSS (enabling the use of DLDSR and RT) for the prospect of running textures just a notch higher is simply not attractive enough.
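For anyone wondering what a 58% render scale actually works out to in pixels, here's the quick arithmetic; the 58% figure is from the post above, while the 4K output resolution is just an assumed example (the monitor's resolution isn't stated):

```python
# What "58% render scale" means in raw pixels. The 58% comes from the post;
# the 3840x2160 output is an assumption, since the monitor isn't specified.
out_w, out_h = 3840, 2160
scale = 0.58

render_w, render_h = int(out_w * scale), int(out_h * scale)
shaded_fraction = scale ** 2  # fraction of native pixels shaded per frame
print(f"{render_w}x{render_h} internal, ~{shaded_fraction:.0%} of native pixels")
# -> 2227x1252 internal, ~34% of native pixels
```

Shading only about a third of the native pixels is where most of the FPS headroom comes from.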
 
AMD won against Intel partly because of Intel's own missteps, which gave AMD time to get its act together. Nvidia is not Intel and won't give them that opening.

Rumours are the 5090 is 70% faster than the 4090, and with AMD languishing at 7900 XTX levels of performance, Nvidia will leave them in the dust before long.
 
True, although it's probably safe to say Nvidia's focus on gaming advancements in the dGPU space is also considerably down, with the focus being put into AI, data centres, etc.
There is no need for them to invest, as there is no competitor at the moment. For the 5000 series, their only competition is their own cards.

They didn't even bother to release a 4090 Ti or 4080 Ti this generation. I expect a considerably slimmed-down lineup with the 5000 series.
 

And charge whatever they want. Great stuff; I'll stick to what I can get in the £500-£600 range.
 
The higher FPS is just a bonus. To me, losing the option and customisation of DLSS (enabling the use of DLDSR and RT) for the prospect of running textures just a notch higher is simply not attractive enough.

This is something I always find amusing too: that lot always make it out like it's the end of the world having to turn the texture setting down one notch, as if they can see a "huge difference":

[texture quality comparison screenshots]

Meanwhile, those same people can't see the difference RT makes in games.

It's just a case of hypocrisy and irony again.

I do hope we see more games benefit from the higher VRAM count, though, with a genuine improvement in visuals and not just as a way to brute-force or work around sloppy optimisation. This is my core problem with the arguments for more VRAM: it isn't really providing any noticeable benefit to games' visuals, unlike RT, and as evidenced, a few patches later the memory leaks get resolved and/or VRAM usage drops, as happened with TLOU, HZD, etc. Devs seem to have found ways of optimising UE5 games to use considerably less VRAM too, despite the first couple of UE5 games looking quite poor on this front. HFW's textures look fantastic, but its visuals are brought down by the limitations of raster effects.

Essentially, TL;DR: we need to see more games making proper use of VRAM to justify the benefits of having more, much the same way people say RT performance needs to be better at the low to mid end for it to become more sought after by gamers.
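As a rough illustration of why one texture notch moves the VRAM needle less than people assume, here's a sketch; the BC7 compression rate (1 byte per texel) and mip-chain overhead (~4/3) are standard figures, but the texture sizes and resident count are made-up round numbers, not from any game:

```python
# Illustrative VRAM cost of a texture pool at two quality notches.
# Assumes BC7 block compression (1 byte/texel) and a full mip chain (~4/3x).
def texture_mib(side_px: int, bytes_per_texel: float = 1.0) -> float:
    base_bytes = side_px * side_px * bytes_per_texel
    return base_bytes * (4 / 3) / 2**20  # mips add roughly one third

resident = 500  # hypothetical number of resident textures
for side in (4096, 2048):
    total_gib = texture_mib(side) * resident / 1024
    print(f"{resident} x {side}px textures: ~{total_gib:.1f} GiB")
# -> ~10.4 GiB at 4K textures vs ~2.6 GiB at 2K
```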
 
Somehow, in your YT Cyberpunk video, I can see a massive difference with path tracing vs even regular RT. Amazing. Maybe I should play through the game again and try to take in the sights? ;)

But I only get 20 fps in path tracing :(
 