
The RT Related Games, Benchmarks, Software, Etc Thread.

The base game is still rasterized, as in the architecture of buildings, weapons etc., and the textures themselves in many places aren't exactly massively high res, which helps with performance. It works in M.E.E as it's nowhere near as open as a game like Spider-Man 2 will be.

Spider-Man 2 will be interesting, but I can't see them having everything ray traced (reflections, global illumination, shadows, ambient occlusion, emissive lighting etc.) while also having good quality with decent FPS. Effects will have to be turned down for the higher-FPS options.

Yup see my edit.
 
@K1LaW8tt5

My reply was based on a 3080 failing right now; in that case the 78XT would be replacing the 3080.

Going forwards, although a 78XT probably couldn't run everything on ultra, it absolutely will be able to run ultra textures, whereas my 4070 probably won't with 12GB.

So if you had ~£500 to replace your failed 3080, then the 78XT would indeed be a good choice.

I wouldn't be streaming anything though; some of us have standards that haven't been watered down by upscaling.

What games are you worried about/have you tested this with?

I have a 4070, and it seems to use 2-3GB LESS actual VRAM in game than a 6800 XT etc. running the same res/settings... Even with RT on I've yet to find ANY game that uses more than 9.3GB at max settings, 1440p native. With DLSS/FG it doesn't seem to use any more than a native-res game without RT on. In reality, vs the 16GB 6800 XT, we both end up with the same VRAM remaining 'spare'; mine just seems to be more efficient in how much it uses at the same res/settings than the 6800 XT. Whether that's sneaky compression or whatever matters not when you can't see it and you still end up with the same in reserve for every game tested head to head at the same settings.

I have yet to find a game where the 'actual' amount of VRAM I'm using in game even comes close to the 'allocated' amount, in any piece of software showing usage stats on the fly.

Even when I ran TLOU at 4K native high, I wasn't using more VRAM than at 1440p ultra native, and if I bother to do 4K with DLSS it's again no more than native 1440p. So, speaking as someone who owns a 4070, I don't see it being a problem, especially with DLSS 3.5/FG.

Cyberpunk uses hardly any VRAM even native or with RT/DLSS 3.5/FG; IIRC about 5.7GB at 1440p max settings with motion blur turned off (personal preference).

TL;DR: if you have a 4070 already, I'd be trying it out now vs worrying/speculating on here.
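
If anyone wants to sanity-check the 'allocated' vs 'actual' split themselves outside of an overlay like MSI Afterburner/RTSS, here's a minimal sketch that reads the driver's own counters. It assumes the nvidia-ml-py (pynvml) package is installed and that the card of interest is GPU index 0; the per-process figure can come back as None on some Windows/driver combos, so treat it as a rough cross-check rather than gospel.

```python
# Minimal sketch: device-wide VRAM in use vs per-process VRAM, via NVML.
# Assumes the nvidia-ml-py (pynvml) package and an NVIDIA driver are present.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo, nvmlDeviceGetGraphicsRunningProcesses,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system (assumed)

    mem = nvmlDeviceGetMemoryInfo(gpu)   # everything resident on the card
    print(f"Device VRAM in use: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")

    # Per-process numbers are closer to what the game itself has committed,
    # as opposed to what the desktop, browser, overlays etc. are also holding.
    for proc in nvmlDeviceGetGraphicsRunningProcesses(gpu):
        if proc.usedGpuMemory is not None:  # can be None on Windows WDDM
            print(f"PID {proc.pid}: {proc.usedGpuMemory / 2**30:.1f} GiB")
finally:
    nvmlShutdown()
```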
 
Jensen is hardly ever wrong.

 
What games are you worried about/have you tested this with?

My comments were made based on a 3080 failing right now and needing to be replaced. I've got a 4070 and I'm not worried about it running out of VRAM, as that's just par for the course being an NV user; my 79XTX does the heavy lifting.

The 4070's limitations vs the 78XT are bandwidth and VRAM, and it costs a good bit more, but it has better RT, upscaling/FG and uses a bit less power.
 
Ah, fair enough. Sorry, I thought you meant you were considering a 4070 but were put off, my bad mate!

Mine's undervolted a stupid amount and tends to use 105-125W at native 1440p ultra, or around 135-145W max with RT/FG/DLSS 3.5 on, even at 4K DLSS... I use 1440p personally, but yeah, they just sip power, don't they! I dunno about yours, but my stock fan curve kicks in at 65C, and other than once in Control at 66C for a brief moment, the fan never even turns on; it sits at 53-57C in every game maxed out natively.

TL;DR: given that it's silent, runs cool all the time with no coil whine and is right next to my leg, I can't fault it. The £192 a year I save in electricity from not going 7900 XT is a nice bonus too; it means after 3 years I've got a free GPU/money towards a 2026 upgrade, and I can put it in my 2nd mITX rig (which I know it fits) or sell it and probably buy a pretty decent tier card for peanuts of my own money! Win win!
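
For what it's worth, that yearly figure is easy to sanity-check with a back-of-envelope calculation. The wattage delta, hours per day and unit price below are assumed figures, not the exact numbers behind that £192, but they land in the same ballpark.

```python
# Rough electricity-saving sketch (all three inputs are assumptions).
POWER_DELTA_W = 200   # assumed: ~320 W 7900 XT under load vs ~120 W undervolted 4070
HOURS_PER_DAY = 9     # assumed heavy daily gaming
PRICE_PER_KWH = 0.29  # assumed UK tariff, in £ per kWh

kwh_per_year = POWER_DELTA_W / 1000 * HOURS_PER_DAY * 365
print(f"~£{kwh_per_year * PRICE_PER_KWH:.0f} saved per year")  # ~£191
```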
 
I've got to admit, I absolutely loved Bryan's answer to all the "fake frames, fake res" outcry :cry: :D Can just imagine the anti-fake crowd's reaction :p

As soon as AMD have FSR3 out, those same people having full-blown nervous breakdowns over Nvidia's frame gen tech on forums will be championing FSR3. I can see it coming a mile off.
 
Didn't the FSR3 screenshots show major fidelity/quality loss and major shimmering though, versus being able to choose/adjust yourself? Or was that clickbait nonsense?

I'm not biased; I currently own 3 AMD CPUs, plus an AMD GPU and an Nvidia GPU, one in each rig...
 
TL;DR: if you have a 4070 already, I'd be trying it out now vs worrying/speculating on here.

Yeah, if only a few would listen. We did warn people back in 2020 that by the time GPUs are struggling because of limited VRAM, GPUs with higher VRAM would also struggle because of lack of grunt. Essentially you buy what is good for your budget and fits your needs at the time. Perfect example: look at the 3090, extra £750 well spent..... Obviously there is no excuse for releasing 8GB GPUs in 2023 at £500+ though.

I recall the 4070 was the card these guys were going to buy.. :)

Wrong.

We were referring to the 4080, and well, it is a nice upgrade but priced very wrong.

The 4070 is a great GPU and I'd sooner have it over a 7800 XT. As TNA and I predicted though, the 4070 caused 3090 second-hand pricing to crash.

As soon as AMD have FSR3 out, those same people having full-blown nervous breakdowns over Nvidia's frame gen tech on forums will be championing FSR3. I can see it coming a mile off.

I personally can't wait for the "DLSS 3 KILLER!!!!!" posts/articles again that will be based off the 1-2 PR showcases, only to then see it become a dumpster fire in every other game afterwards :p

In all seriousness though, I'm hoping it will be good and that any issues simply won't be that noticeable, given that with Nvidia's frame gen, as per HUB's and DF's etc. findings, you really need to be slowing the footage down and/or picking out the fake frames to easily notice the "awful IQ" issues...... So in theory, it shouldn't be quite as bad as FSR 2, where the upscaling issues are present in ALL frames.

The biggest point will be latency; if AMD can somehow manage similar frame gen latency to Nvidia then kudos there, but given their own info/slides, I suspect the good results will only be on RDNA 3 hardware.
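
As a rough mental model of where that latency comes from: with interpolation-style frame generation, the newer of the two real frames has to be held back until the in-between frame has been built and shown, so the added delay scales with the base frame time plus whatever the generation pass costs. The sketch below is purely back-of-envelope; the half-frame model and the 3 ms overhead are assumptions, not measured numbers for DLSS 3 or FSR 3.

```python
# Back-of-envelope sketch of display delay added by interpolation-based
# frame generation (assumed model, not a measurement of any vendor's tech).
def frame_gen_delay_ms(base_fps: float, gen_overhead_ms: float = 3.0) -> float:
    frame_time_ms = 1000.0 / base_fps  # gap between two real frames
    # The newer real frame waits roughly half a base frame while the generated
    # frame is displayed, plus the cost of generating it.
    return frame_time_ms / 2 + gen_overhead_ms

for fps in (30, 60, 120):
    print(f"{fps:>3} fps base -> ~{frame_gen_delay_ms(fps):.1f} ms extra delay")
```

On those assumptions, the penalty shrinks quickly as the base frame rate rises, which is why a low base frame rate is the worst case for frame generation.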
 
I personally was VERY against DLSS, till I went and tried FSR 1/2 on my AMD card, then tried DLSS 2 on my 4070 and was blown away by 3/3.5/FG and the super low latency. For once the 'claimed' FPS vs clarity (on my card/this gen) was actually true. I'm so glad I didn't get the 6800 XT/3070 Ti I was planning on getting this time last year; I'd be fuming if I'd bought them!
Now, having benched my 4070 against my mate's 6800 XT at the same native res/settings, with mine coming out the same or better natively with RT on, I can't really say no to it for free!
 
I will just clarify that I currently own 3 AMD CPUs, an AMD GPU and an Nvidia GPU, so I'm not biased... I was just VERY underwhelmed when I saw this, how the 7900 XT can't do RT, and how poor FSR 1/2 is, then saw 3 and thought, here we go again...

In all honesty I don't really care that much about FG, FSR3, DLSS, RT etc. etc... It's all nice, don't get me wrong, but my main game is FFXIV and my rig pretty much goes to sleep playing it. I treat everything else as a momentarily nice-looking tech demo, then go back to FFXIV :p
 
I just think that if I'm buying something brand new, I want the most years of warranty/the fullest feature set as standard. Otherwise, the more we let them get away with it, the more they can charge for higher 'tier' price brackets with features we should have got as standard. It's just scummy otherwise.
 
For the peasants (you know, those on, for example, RTX 3050-level cards), cloud gaming might be the only way to experience this.

Otherwise it's watching YouTube videos and wishing the streamer would turn this way or go back and look at something.

That's for RT ultra, psycho and RR.

As for Fake Frames / Frame Generation, well, let's just say that for those who keep claiming the latency makes it a bad idea... with cloud gaming nobody can be sure whether the latency is their internet connection, Nvidia's servers, or something inherent in faking frames!
 
Having played CP with DLSS 3.5/FG/Super Low Latency Mode/RT on psycho at 1440p max settings, I was genuinely impressed with the input response. For SP it's always going to be more acceptable than for MP, but yeah, it'll be interesting to see someone try and maintain 1440p/4K smoothly in general, let alone with RT on; it's always been a shambles getting 1080p to not fluctuate in res/quality from what I've ever seen of streaming, haha.
 