
NVIDIA 5000 SERIES

He's one of the 4K native crowd who think path tracing at 4K native is going to be possible, not realising that it isn't even relevant for modern gaming and won't be going forwards, as we now jump straight into AI rendering on top of AI upscaling.

I said it last week: the criers of frame gen and upscaling will have a new thing to cry about come Jan/Feb.

4k native = best image quality and the benchmark for DLSS to aim for. Will be interesting to see how many years it takes for DLSS to match native resolution without bugs.

Of course I'm forced to use DLSS on my 4090 to get FPS close to 100 in modern games, as the 4090 is too slow at native 4K. Though DLSS does have bugs and issues, visual artifacts with certain textures/areas in some games. It's improving, though still not close to native.

5090 will enable more games to be playable at 4K native, which is exciting to me :)
 
That's maybe part of it. The other part is Nvidia charging through the nose, and AMD thinking they could just come in a little cheaper without said features and hoping it would work. If Nvidia is charging through the nose, so are AMD.

They need to release cards at sensible prices at launch and not wait for Nvidia to release their cards first.

Had AMD come in at £599 and £799 for the 7900 XT and 7900 XTX they would have sold loads more.

Surely possible, as these guys are charging through the nose, no?

Agreed, and I should have pointed out that a large part of the problem is AMD and their stupid pricing. They were a lot more sensible with the GRE and the 7800 XT, but by then it was too late to impact market share. What's the betting they overprice the 8800 XT again (or whatever they call it)?
 
4k native = best image quality and the benchmark for DLSS to aim for. Will be interesting to see how many years it takes for DLSS to match native resolution without bugs.

Of course I'm forced to use DLSS on my 4090 to get FPS close to 100 in modern games, as the 4090 is too slow at native 4K. Though DLSS does have bugs and issues, visual artifacts with certain textures/areas in some games. It's improving, though still not close to native.

5090 will enable more games to be playable at 4K native, which is exciting to me :)

Lies. Mrk says 4090 is way too powerful and it will take a game much more intensive than Cyberpunk to make it worth upgrading.

Yes, he said this earlier this year.
 
 
So the 4090 was a super halo part with lots of talking points like FG, PT, etc.
The 4070 and below were the volume parts with the same feature set but too little VRAM to use some of the big talking points like FG.
The halo part helped sell the other parts, like in previous generations. The previous-gen 3070 outsold the 6800 by a huge margin because in heavy RT titles it could be 50% or so faster - but in real life both were unplayable, something like 16 vs 24 FPS!

I suspect the 5070 will match a 4080, but only where the 5070 isn't VRAM limited. Nvidia's AI texture compression to the rescue? Well, maybe, but texture compression is one thing; the VRAM overheads for FG, RT and PT might be something else entirely.
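To put rough numbers on why that worries me, here's a quick back-of-the-envelope budget. Every figure is a made-up placeholder for illustration, not a measurement from any game or card:

```python
# Back-of-the-envelope VRAM budget at 4K with RT/PT and frame gen enabled.
# All numbers are hypothetical placeholders for illustration only.
budget_gb = 12.0                        # e.g. a 12 GB xx70-class card

usage_gb = {
    "textures (uncompressed)": 7.0,     # assumed texture pool
    "frame buffers / render targets": 2.0,
    "RT/PT acceleration structures": 2.0,
    "frame generation buffers": 1.5,
    "upscaler + misc overhead": 1.0,
}

total = sum(usage_gb.values())
print(f"Without AI texture compression: {total:.1f} GB (budget {budget_gb} GB)")

# Suppose a hypothetical AI texture compression halves only the texture pool.
usage_gb["textures (uncompressed)"] *= 0.5
total_compressed = sum(usage_gb.values())
print(f"With textures halved: {total_compressed:.1f} GB")
# The RT/PT and frame-gen overheads are untouched, so even with compressed
# textures the card is left with very little VRAM headroom.
```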

I wouldn't mind the above so much as long as "journalists" reported things honestly, but I suspect Nvidia's PR machine will once again be setting the narrative. Hence nobody who values release-day access will dwell on shortcomings like the latest and greatest feature requiring more VRAM than the mainstream cards actually have.
 
I think this is one of the metrics a lot of the mainstream press and most enthusiasts miss.

"OMG, Nvidia are 30% faster in extreme RT." They don't realise that 30% faster on their 4070 is 34 FPS instead of 26 FPS, and neither is playable. When you add the obligatory upscaling, that 30% becomes 5-10% and you are now comparing ~60 FPS vs ~65 FPS.
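Rough sketch of the maths, using the numbers above (the upscaled figures are the ~60/~65 FPS ballparks, not benchmark data):

```python
# How a headline percentage gap looks in absolute FPS and frame times,
# using the illustrative numbers from the post (not benchmark data).

def compare(fps_a: float, fps_b: float, label: str) -> None:
    gap_pct = (fps_b / fps_a - 1) * 100
    frametime_a = 1000 / fps_a   # ms per frame
    frametime_b = 1000 / fps_b
    print(f"{label}: {fps_a:.0f} vs {fps_b:.0f} FPS "
          f"(+{gap_pct:.0f}%, {frametime_a:.1f} ms vs {frametime_b:.1f} ms)")

# Native extreme RT: a big relative win, but both results are unplayable.
compare(26, 34, "Native RT")

# With the obligatory upscaling both cards are playable,
# and the relative gap collapses.
compare(60, 65, "Upscaled RT")
```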
 
I have a simple benchmark when buying a GPU: just go by the official transistor count. It has only failed me once, with the RX Vega 64, which had more transistors than a 1080 Ti, but it has been pretty much bang on otherwise - been ages since I have looked at mainstream GPU reviews.
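For what it's worth, here's that heuristic written out, with the rough public transistor counts as I remember them (treat the figures as approximate):

```python
# The "just compare transistor counts" heuristic from the post.
# Figures are approximate public numbers from memory - double-check before relying on them.
transistors_bn = {
    "GTX 1080 Ti": 11.8,
    "RX Vega 64": 12.5,   # the one case where the heuristic failed the poster
}

def pick_by_transistors(a: str, b: str) -> str:
    """Naively pick whichever card has more transistors."""
    return a if transistors_bn[a] >= transistors_bn[b] else b

print(pick_by_transistors("GTX 1080 Ti", "RX Vega 64"))
# -> "RX Vega 64", even though the 1080 Ti was the faster gaming card,
#    which is exactly the exception the post mentions.
```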
 
I have a simple benchmark when buying a GPU: just go by the official transistor count. It has only failed me once, with the RX Vega 64, which had more transistors than a 1080 Ti, but it has been pretty much bang on otherwise - been ages since I have looked at mainstream GPU reviews.

I try the same but it's not worked for many years 'cause I can only count to 11 then I run out of fingers :(
 
7800xt is more like 6800xt / 3080 performance on pure raster anyway
I think he (@humbug ) means once you've undervolted and overclocked the 7800 XT. I get a 22,271 Time Spy score with mine (rock stable for all my games), so not too far off stock 4070 Ti scores. So I have basically upped performance over a stock 7800 XT by about 10%.
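In other words (the stock score below is back-derived from the "about 10%" claim, so it's an estimate, not a measured stock run):

```python
# Percentage uplift of the tuned 7800 XT over an assumed stock score.
# The stock figure is back-derived from the "about 10%" claim, not measured.
tuned_score = 22271
assumed_stock_score = 20200   # rough estimate of a stock 7800 XT Time Spy result

uplift_pct = (tuned_score / assumed_stock_score - 1) * 100
print(f"~{uplift_pct:.0f}% over stock")   # ~10%
```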
 
I have a simple benchmark when buying a GPU: just go by the official transistor count. It has only failed me once, with the RX Vega 64, which had more transistors than a 1080 Ti, but it has been pretty much bang on otherwise - been ages since I have looked at mainstream GPU reviews.

That's definitely bro-science logic.

That's because you can't compare paper specs between architectures - many things other than transistor count affect performance.

Buying GPUs only because they have more transistors is like buying a car simply because its engine has higher displacement than your previous car's - which, again, is not the only indicator of performance and will get you burnt.
 
I think he (@humbug ) means once you’ve undervolted and overclocked the 7800xt. I get a 22271 Timespy score with mine (rock stable for all my games) so not too far off stock 4070ti scores. So I basically have upped performance over stock 7800xt by about 10%.

I'm just looking at stock across all of them, otherwise it gets messy, and looking at overall gaming averages.

 
I think he (@humbug ) means once you’ve undervolted and overclocked the 7800xt. I get a 22271 Timespy score with mine (rock stable for all my games) so not too far off stock 4070ti scores. So I basically have upped performance over stock 7800xt by about 10%.

Almost pointless looking at 3DMark only, IMO. Need to look across many games. Oh, and I can overclock my 4070 Ti also.

I am actually using said OC profile when playing STALKER 2. I get the extra FPS and extra heat, which is welcome :cry:

I have 5 profiles in Afterburner. The one I typically use cuts power usage dramatically and loses a few percent of performance. In many games I draw 120-140W with that profile. Brilliant, really.
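Rough efficiency maths for that power-limited profile. The stock board power is an assumed ~285W for a 4070 Ti and the "few percent" loss is taken as 3%, so treat it as a sketch rather than a measurement:

```python
# Perf-per-watt of a power-limited Afterburner profile vs stock.
# Stock power and the exact performance loss are assumptions for illustration.
stock_power_w = 285          # assumed stock 4070 Ti board power
tuned_power_w = 130          # middle of the 120-140 W range from the post
perf_vs_stock = 0.97         # "loses a few percent" taken as ~3%

efficiency_gain = (perf_vs_stock / (tuned_power_w / stock_power_w) - 1) * 100
print(f"~{efficiency_gain:.0f}% better perf-per-watt than stock")
```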
 