
Just what is NVIDIA up to?

20+ years ago, problems with games not running properly were just part of the PCMR experience. With both ATI/AMD and Nvidia there was often something odd going on; you accepted it because you were used to it, that's how it was.

To begin with, Nvidia put a lot of driver work in to try and make that a thing of the past, with success.

However, since then and for some years now, AMD have also put that work in. I couldn't tell the difference between the 5700XT and the 2070S in terms of smoothness, responsiveness, dependability and all of that... they are exactly the same.

Apart from one thing. At the time my mate @pete910 had a GTX 1080, and we started playing Insurgency.

My mate with his 1080: "I've got purple crates, why do I have purple crates?"
Me, RX 5700XT: "the #### are you talking about?"
My mate: "all the crates in the game are purple"
Me: "no, they are wood"
My mate: "here's a screenshot"
Me: "hm? Purple crates"

One RTX 2070S later.
Me: "ah... there's my purple crates"

It got fixed, with a driver, eventually.
LOL, how long was it till it was fixed?

There were a few other games with various issues as well, IIRC.

But ya know,

Nvidia drivers good!
AMD drivers bad.........


What a load of **************************
 
Sounds like a recipe for a pick'n'mix of badly validated products and a poor customer experience, as driver updates push one or the other 'brand' over the edge.
I do think we could have more variation in the optimisation allowed as shipped from the factory, to match the often oversized 'marketing' coolers, instead of the "2%" OC models for £££ extra.
AMD seem to have a habit of running the voltage too high for 'extra stability', so a vendor could better optimise the out-of-box performance with voltage tuning and a power-limit boost to suit the cooler.

All we need is a simple product stack at perceived 'value' price points instead of the usual micro-segmentation we get through each generation.

AMD's stack is not unreasonable at the current price points, even if it's a mix of generations.
At least they fixed the $100 7900XT/XTX gap that killed the XT launch reviews.
Clearly AMD has not achieved its performance goals for this generation and is late on features it promised at launch, so it needs to offer value.

Nvidia, on the other hand, made a huge leap in architecture, but chose to push up prices and cut down the cards, resulting in a meh generation for most segments.

Right now, I need a GPU for a budget box so the kids' mates have something better than a Lenovo Tiny to play on when they come round.
Looking at a 6600/7600 as it will be paired with an i7-4790; older cards might be OK, say a 5600XT, but Starfield sweetens the deal somewhat.

On the other side, tempted to grab a 7900XT and shove my current GPU in the budget box.

Nothing tempting me from Nvidia this Gen, though I'd take the 4080 at £800.

Hm... yeah ok.

My idea vs reality.

 
Or for AMD, do you stick with the last generation and grab a 6750XT / 6700XT?
I'd personally get a 6700XT at £299 to 'get by' for now and see what the next generation of cards brings to the table. This gen doesn't seem to be very good value generally. It's either that or wait and see what the 7700XT is like; it's rumoured to launch at Gamescom later this month.
 
As a GPU the 5700XT was great: it ran perfectly, with good, smooth performance and awesome drivers, but a bad cooler, a really, really bad cooler. I'll say this: Nvidia would never allow anyone to put one of their GPUs onto the market with a cooler like that.
Yeah, it ran perfectly apart from the black-screen issues that plagued tons of users, including myself.

Those black screens were so frustrating that I vowed at the time to never come near an AMD card again.
 
Holding back RT progress?

Surely that's not just consoles and AMD. It's basically everything except the 4090, including previous halo cards like the 3090 Ti and 2080 Ti.

The biggest problem I see with RT is that unless games are developed 100% for RT with no raster fallback, RT won't look that good. To me, Cyberpunk not only fails to look great despite running so slowly; with RT on, the lighting is often wrong. Only dropping raster entirely and creating RT-only games can solve this.

Even a 4090 would struggle then. Since all games are console-first, the next-gen consoles will need 4090-level RT performance at minimum, IMO.
 
I think with the news that AMD won't be competing at the high end, we can be sure the RTX 5000 series will be far overpriced, possibly even more so than the 40 series, and probably not as big a performance leap for the 90 either: if there's no competition, why the need for such a big jump?

It's a card I'll want badly, but all I'll be able to do is look and not touch, despite much want.
 

The thing with RT: the 4080, a card similar in raster to the 7900XTX, is 16% faster in RT on average according to TPU at 4K, and those averages include Cyberpunk.

Take Cyberpunk on its own and the 4080 is 45% faster. That's what we like to think of as the natural RT performance difference between Nvidia and AMD, and all the tech tubers use it as a reference.
But it isn't: it's one of a couple of outliers, a black-boxed Nvidia RT showcase title.
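
As a rough sanity check on the outlier point, here's a small sketch of how much one 45% title can drag up a 16% average. The 16% and 45% figures are the TPU numbers quoted above; the suite sizes are assumptions purely for illustration, not TPU's actual game list.

```python
# How much one outlier title can lift an average lead.
# 16% = quoted TPU 4K RT average (4080 vs 7900XTX), 45% = Cyberpunk alone.
# The suite sizes below are assumptions, purely for illustration.
avg_with_outlier = 16.0
outlier = 45.0

for n_titles in (6, 8, 10):
    avg_without = (avg_with_outlier * n_titles - outlier) / (n_titles - 1)
    print(f"{n_titles}-game suite: average without Cyberpunk ~ {avg_without:.1f}%")
```

In other words, pull Cyberpunk out and the typical gap drops to roughly 10-13% under those assumptions.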

We all know Nvidia have a history of using sponsored titles coded in such a way that it hurts their own performance ("great, buy a GPU from our more expensive range, a 4090 perhaps?") but hurts AMD even more.
We all know Nvidia does this.
It's obvious they are doing it right now.
And yet we all pretend Cyberpunk is the natural order of things, despite knowing it's an outlier.

Nvidia marketing: they can convince you it's raining while they're peeing all over you.
 
I always thought ray tracing was a gimmick. I have an RTX 3070 and I've never used ray tracing, apart from one game I tried it in, where it made things look worse and run like ****.
 
Anyway:
PS3 (2006) > PS4 (2013) = 7 years. 230 GFLOPS >> 1,840 GFLOPS (x8)
PS4 (2013) > PS5 (2020) = 7 years. 1,840 GFLOPS >> 10,300 GFLOPS (x5.6)
So PS6 by 2027?
Well, Moore's Law is more or less dead, so while ordinarily we might expect about the same again (x5 to x8), which at around x6 would give us 10,300 x 6 = 61,800 GFLOPS (pretty close to an RTX 4090), I suspect it will be less.
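
Just to make the arithmetic behind that extrapolation explicit, here's a quick sketch. The console GFLOPS figures are the ones quoted above; the PS6 date and the x5 to x8 range are the same guesses as above, nothing official.

```python
# Back-of-envelope console GPU compute extrapolation (FP32 GFLOPS).
# Figures are the ones quoted in the post; PS6 values are pure speculation.
consoles = {
    "PS3 (2006)": 230,
    "PS4 (2013)": 1_840,
    "PS5 (2020)": 10_300,
}

names = list(consoles)
for prev, cur in zip(names, names[1:]):
    print(f"{prev} -> {cur}: x{consoles[cur] / consoles[prev]:.1f}")

# A hypothetical PS6 (~2027?) using the historical x5 to x8 range.
ps5 = consoles["PS5 (2020)"]
for scale in (5, 6, 8):
    print(f"PS6 guess at x{scale}: {ps5 * scale:,} GFLOPS ({ps5 * scale / 1000:.1f} TFLOPS)")
```

The x6 case is the 61,800 GFLOPS figure mentioned above.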

And realistically, for all-out RT even the 4090 is too slow. I'm sure some clever design will help (current RT is pretty much brute force, after all), but for those who want everything to be RT-only... PS7?

Purists wishing to ditch all the raster tricks are being totally unrealistic anyhow. The problem is that most RT I have tried has looked worse, even if occasionally a bit more realistic. (I personally play games for fun, not to look at 95% black screens no matter how realistic that might be; I also don't fancy having my retinas burned when a 95% black screen suddenly has a 1,500-nit explosion.)
 