*** The AMD RDNA 4 Rumour Mill ***

You say all that, but I'm more interested in the 9070/XT over the 5070/Ti if the value is better: rasterization, playable RT, and good hardware upscaling.

What's frame reprojection? Is that frame generation? I couldn't care less about that; it's even more worthless the further down the GPU stack you go, because you need a certain base FPS for it to work well.
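To put rough numbers on that base-FPS point, here's a quick back-of-the-envelope sketch (the multipliers and framerates are illustrative assumptions, not measurements from any specific card): frame generation multiplies the frames you see, but input is only sampled on real rendered frames, so responsiveness still tracks the base framerate.

```python
# Rough sketch of why frame generation needs a decent base FPS.
# All numbers here are illustrative, not measured figures.

def frame_gen_stats(base_fps: float, multiplier: int) -> tuple[float, float]:
    """Return (displayed_fps, approx_real_frame_time_ms) for a given
    rendered base framerate and frame-generation multiplier."""
    displayed_fps = base_fps * multiplier
    # Input is only sampled on real (rendered) frames, so latency is
    # still tied to the base frame time, not the displayed one.
    base_frame_time_ms = 1000.0 / base_fps
    return displayed_fps, base_frame_time_ms

for base in (30, 60):
    for mult in (2, 4):  # 2x FG vs 4x MFG
        shown, latency = frame_gen_stats(base, mult)
        print(f"{base} base FPS x{mult}: {shown:.0f} FPS shown, "
              f"~{latency:.1f} ms per real frame")
```

At 30 base FPS, 4x MFG shows 120 FPS on the counter, but each real frame still takes ~33 ms, which is why it feels worse the lower down the stack you go.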

I am surprised how many people don't understand that MFG is basically a gimmick to fool people into thinking they can buy the 5070 and it will outperform the 4090 at a third of the price (if you're lucky).

I think we're going to have to get used to wishy-washy graphics in the future, because a lot of it is going to be upscaling and AI-powered MFG with a few more software bells and whistles added to boost performance statistics on paper while making everything look a bit more ****. And with the copium in people, whether it's Nvidia or AMD fanboys, they'll fool themselves into thinking this is progress.
 
Mindshare is mostly nonsense, and condescending to the customer. People buy Nvidia because they innovate new technologies, add a ton of features, and then bring those technologies and features to games and applications. They add additional value, and many PC gamers have interests in areas outside gaming; otherwise we would just get consoles. You can argue we don't need those technologies if you want, but as those techniques mature that is becoming an increasingly difficult argument.

Nvidia brought tessellation, CUDA, frame reprojection, NVENC, variable refresh rate, AI upscaling, ray tracing, frame generation, RTX HDR, Broadcast, Reflex, mega geometry and neural rendering. During that time AMD has just been reactive and didn't innovate; they have some of those features now, but only because Nvidia did it first, and a lot of the time those features were tacked on to existing hardware that wasn't planned or designed for them.

In the 5000 series alone, Nvidia has added the DLSS transformer model, mega geometry, neural rendering, MV-HEVC, and 4:2:2 chroma encode/decode. The transformer model is looking like a big reason to choose Nvidia unless FSR4 is extremely special, and the improvements to NVENC are massive for content creators and VR users.
The transformer model for DLSS is honestly overrated atm; most of the notable difference so far is that it looks sharper than the CNN model. But CNN by default is just generally blurry due to a lack of sharpening, and using a sharpening filter with it automatically brings out more detail to the point where it is on par with the transformer model. I've been testing it for over two weeks now, and in a lot of games, not only does CNN + sharpening give better performance, it also provides a slightly more stable image a lot of the time.

The transformer model tends to produce a lot of localised shimmering on objects and textures that isn't present with the CNN model. Even with ghosting reduction, which is the main touted benefit of the transformer model, there is a general reduction but it's still present with some fast-moving objects, and there are scenarios where things that shouldn't ghost display noticeable ghosting. E.g. moving the camera can sometimes cause static objects/textures to ghost where they wouldn't have with the CNN model.

Most of the praise I have seen for it comes from people who complain that DLSS 3 was too blurry, while seemingly not understanding that they should have been using a sharpening filter, especially after Nvidia disabled the built-in sharpening around DLSS 2.5. So really, I think if FSR4 can match DLSS 3 image quality and adds a bit of sharpening by default, the differences to DLSS 4 will not be considerable at all.
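For anyone wondering what a "sharpening filter" actually does, the usual idea is an unsharp mask: subtract a blurred copy of the image to boost edge contrast. A minimal sketch in Python (a generic filter for illustration, not Nvidia's actual sharpening pass; the sigma and amount values are assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image: np.ndarray, sigma: float = 1.0,
                 amount: float = 0.5) -> np.ndarray:
    """Classic unsharp mask: image + amount * (image - blurred)."""
    blurred = gaussian_filter(image, sigma=sigma)
    sharpened = image + amount * (image - blurred)
    return np.clip(sharpened, 0.0, 1.0)

# Toy example on a soft edge: values either side of the transition get
# pushed further apart, which is what reads as "more detail" on screen.
img = np.array([[0.2, 0.2, 0.5, 0.8, 0.8]])
print(unsharp_mask(img, sigma=1.0, amount=0.5))
```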
 
lol at so much revisionist nonsense. ATI did tessellation, not Nvidia. AMD did multi-monitor gaming, and upscaling was a console tech long before Nvidia “invented it”.
VRR was a laptop battery-saving tech before G-Sync was “invented”.

Lots of the tech you attributed to Nvidia “inventing” was actually pioneered long before you think it was, by companies other than Nvidia.

I’m happy to give credit where it’s due but you are literally convincing yourself Nvidia “invented” technologies that were copies of existing technology.

Nvidia take someone else’s idea = innovation.
AMD take someone else’s idea = reactionary.
I did read all that, and I instantly thought it sounds a lot like Apple: a lot of buzzwords for tech that already existed, wrapped up in new Nvidia marketing bull. lol.
 
You can argue we don't need those technologies if you want, but as those techniques mature that is becoming an increasingly difficult argument.
This is a bit of a double-edged sword. When the 2080 Ti launched, RT was next to useless and you were paying a hefty premium for the privilege; now that it's just starting to be required in a couple of games, you can get far better RT performance for far less money.
 
And so it begins: the melting connectors on the 5090 are coming to light on Reddit.
I mentioned Nvidia seemingly getting an easy ride over the melting 4000-series connectors and wondered why people were so ready to rely on them, even in revised form, with even more power-hungry cards.

I was assured there was nothing to see here....

My understanding is that the upcoming AMD cards use the regular connectors, but AIBs haven't been restricted from using the new power connector.

It'll be interesting to see how many melty issues there'll be on those cards, if any.
 
I think at least one brand was shown with the new connector, Colorful maybe? It seemed to be one of the lesser-known brands from what I remember.
 
I had an S3 Savage4 back in the day.
And we had 3dfx.
So five different brands, for one brief shining moment in the '90s. Though since there was no consensus on APIs, buying the wrong card often meant it just wouldn't work properly with your games. Fun times :P
 
I'm so tempted to pick up a B580 to mess with while waiting for RDNA 4. I'm very curious to see one in action, and they aren't much here, around 280 quid. Then the trifecta would be complete and I could claim myself fully unbiased /s
Going by your sig you've got one; how is it?
 