
NVIDIA ‘Ampere’ 8nm Graphics Cards

I think we need one of those Reality Check articles the BBC puts out, for the silly people in this thread :)

But I believed the "great leathered one" and thought I was going to get 60-120fps with RT at true 4K this gen (no DLSS). My LG OLED 48" CX demands it!
 
Having seen it first hand, my subjective opinion is that it's incremental at best. I would rather have seen revolutionary breakthroughs in character and story AI that would have made a 'write your own story as you play' scenario possible... but the processing power, if you ask me, is being wasted on a special effect.

Though I'm not expert enough to comment on whether there are hacks for out-of-scene reflections and indirect illumination with rasterization (I was thinking of reading a book or two, but could never find the time; it's an interesting topic), I can only hypothesise that some tricks should be possible. They won't provide an accurate RT render, but they should be able to trick the human eye with some kind of optical illusion.
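For what it's worth, the classic rasterization trick for reflections is screen-space ray marching against the depth buffer, and it is exactly why out-of-scene reflections fail: anything that was never rendered on screen simply isn't in the buffer to be hit. A minimal toy sketch of that idea (1D depth buffer, made-up values, not any engine's actual implementation):

```python
# Toy sketch of a screen-space reflection "hack": march a reflected ray
# across the depth buffer and report a hit only if the ray passes behind
# already-rasterised geometry. Off-screen objects can never be reflected,
# because they were never written into the buffer in the first place.
# All values here are made up for illustration.

def screen_space_reflect(depth_buffer, start_x, start_depth, dir_x, dir_depth, steps=64):
    """Return the buffer index the reflected ray hits, or None if it
    leaves the screen / misses everything (the failure case real RT avoids)."""
    x, depth = float(start_x), start_depth
    for _ in range(steps):
        x += dir_x
        depth += dir_depth
        ix = int(round(x))
        if ix < 0 or ix >= len(depth_buffer):
            return None          # ray left the screen: no data, so no reflection
        if depth >= depth_buffer[ix]:
            return ix            # ray went behind rasterised geometry: treat as a hit
    return None

# 1D "depth buffer": smaller numbers are closer to the camera.
depth_buffer = [5.0, 5.0, 4.0, 3.0, 2.5, 2.5, 9.0, 9.0]

print(screen_space_reflect(depth_buffer, start_x=1, start_depth=1.0, dir_x=1.0, dir_depth=0.4))  # hits index 5
print(screen_space_reflect(depth_buffer, start_x=6, start_depth=1.0, dir_x=1.0, dir_depth=0.1))  # None: ray leaves the screen
```

The second call is the "optical illusion breaks down" case: the moment the reflected ray needs information from outside the frame, the hack has nothing to sample.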

Once you've seen a proper RT implementation for things like accurate real time indirect light (both in illumination and reflection) you won't easily be fooled by attempts to trick the eye with hacks which will look quite dated in comparison.

The problem is that if a game has to support both RT and non-RT rendering, it forces a big compromise. You lose a lot of RT performance in a game that utilises bits of both, and it doesn't show off what RT can do. Alternatively, you need to maintain a separate branch of the game assets and world maps without any of the hacks used in traditional rasterisation (which can interfere with a fuller implementation of RT), and that's a lot of work for developers. So RT is somewhat hamstrung at the moment.
 

So what you're saying is, don't ever use Ray Tracing? Gotcha. Like having Nigella in evening wear hand carve you some oven roasted goose, then having to go back to asda smartprice turkey roll sarnies.

I just realised I never bothered using the feature, I bought my 2080ti for 4k60, not some embryonic technology that won't be properly integrated for about a decade.
 
The game is not out yet and it's only 30 mins long (the original Bright Memory is out).
But I will try the demo if it comes out.

It's not, but the benchmark is (how else did the benchmark results appear?).
Would be interested to test my 2080Ti with and without RTX.

If the PS5 can do a solid 60fps at 4K and the 3080 can't, then it's a PS5 for me.
 
Hmm, if the heatsink is at 65°C, surely the core is going to be at least 80°C?

It points to that, but with no solid numbers to work from it's all guesswork at this point. I think this will be like Fermi, and waterblocks are going to be a must to get the most from these cards without them melting down XD
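For a rough sense of why a 65°C heatsink reading implies a hotter core, the usual back-of-envelope model is core temperature ≈ heatsink temperature + dissipated power × die-to-heatsink thermal resistance. A toy calculation with assumed numbers (the 320W draw and 0.05°C/W resistance are illustrative guesses, not leaked specs):

```python
# Back-of-envelope core temperature estimate: the die must sit above the
# heatsink by (dissipated power x die-to-heatsink thermal resistance).
# Both inputs below are assumptions for illustration, not real Ampere specs.

heatsink_temp_c = 65.0      # reported heatsink temperature
board_power_w = 320.0       # assumed GPU power draw
r_die_to_sink = 0.05        # assumed thermal resistance, degC per watt

core_temp_c = heatsink_temp_c + board_power_w * r_die_to_sink
print(f"Estimated core temperature: {core_temp_c:.0f} degC")   # ~81 degC
```

So the "at least 80" guess above is plausible under those assumptions, but without real power and resistance figures it stays guesswork.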
 
This RTX benchmark screenshot could be anything, could have been benched on a 2500K system or something daft for all I can tell.
 
We don't know. Afaik the benchmark is not publicly available yet.

But if I tell you the 3080 gets 20 fps in a game at 1080p, what difference does it make what other gpus get? Would you be okay with that?

Because it's about relative price/performance, not just isolated figures.

If another GPU cost half as much but got 19fps in the same game with the same settings, it makes a lot of difference.
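To make that relative comparison concrete, here's a quick value-for-money calculation with placeholder prices (the £700/£350 figures are illustrative, not real Ampere pricing):

```python
# Toy fps-per-pound comparison: isolated fps numbers mean little,
# relative price/performance is what decides the purchase.
# Prices and fps below are placeholder figures for illustration only.

cards = {
    "GPU A": {"price_gbp": 700, "fps": 20},
    "GPU B": {"price_gbp": 350, "fps": 19},
}

for name, c in cards.items():
    value = c["fps"] / c["price_gbp"]
    print(f"{name}: {c['fps']} fps at £{c['price_gbp']} -> {value:.3f} fps per £")

# GPU B delivers roughly twice the fps per pound despite being 1 fps slower.
```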
 

Exactly, +1. What's the point in upgrading then? Some people are deluded. If you're going to call it a next-gen ray tracing card, at least back the thing up with today's titles, never mind next year's. Jeez, unbelievable.
 
I've decided to wait now. There is too much up in the air with RDNA2 and Zen3 coming, and new VR headsets (G2 and Quest 2) announced but not out yet. I need to wait for the dust to settle so I can properly evaluate what I need and at what overall cost.
 
That's some twisted logic... there will be some definite improvement in RTX-off scenarios as well.
And I have become a big fan of DLSS. I have even started to think of scenarios where ray tracing may get abstracted into DLSS: instead of actually shooting those rays, they would create a unified DLSS-style model that could predict the scene's ray-traced illumination based on game object types and their placement. The RTX method looks like naive donkeywork to me; it just lacks the kind of elegance that we have come to expect from human-generated algorithms.
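Purely to illustrate the idea being floated here (not anything NVIDIA has announced), such a learned illumination predictor would amount to a small network mapping per-pixel scene features straight to a radiance estimate. A minimal numpy sketch, with made-up feature and layer sizes:

```python
import numpy as np

# Toy sketch of the idea above: a small MLP that maps per-pixel scene
# features (position, normal, albedo, etc.) straight to a predicted
# indirect-illumination colour, instead of tracing rays.
# Layer sizes and features are arbitrary choices for illustration.

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Random weights for a tiny fully connected network."""
    return [(rng.normal(0, 0.1, (m, n)), np.zeros(n)) for m, n in zip(sizes[:-1], sizes[1:])]

def predict_radiance(params, features):
    """Forward pass: features (N, 9) -> predicted RGB radiance (N, 3)."""
    x = features
    for i, (w, b) in enumerate(params):
        x = x @ w + b
        if i < len(params) - 1:
            x = np.maximum(x, 0.0)     # ReLU on hidden layers
    return np.maximum(x, 0.0)          # radiance can't be negative

# 9 input features per pixel: world position (3), normal (3), albedo (3).
params = init_mlp([9, 64, 64, 3])
pixels = rng.uniform(size=(4, 9))          # a handful of fake pixels
print(predict_radiance(params, pixels))    # untrained, so the output is meaningless
```

Of course, an untrained network like this predicts nothing useful; any real version would still need traced or baked ground truth to learn from, which is partly why "just let DLSS imagine the rays" wouldn't come for free.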


What you're basically saying is you'd rather have DLSS than ray tracing? So we've got another beta card about to be released to the market with very little software to support it. There is only one title, yes one, that makes proper use of DLSS according to the experts, and that's Control; unfortunately that's pants in my book. I'd rather they had doubled up on ROPs, because that's what we need for the near future, not some half-baked beta implementations that software developers still need Nvidia's permission and expertise to implement properly. That ultimately explains why Nvidia has shown us Control and boasted about it: they know damn well it's the best implementation of it.

I can see the mockery of #RTXOFF popping up again in the very near future.
 
From that screenshot, it's clear that on the 3080 it's better to turn RTX off. If that's the case, the real question is how do other cards perform with RTX off?

Simply put, RTX still seems fairly half-baked and not worth the FPS hit... which further adds fuel to the fire: why go for NVIDIA if RTX is better off disabled and we just grab the extra FPS?
Because I don't care about FPS past 30 and I like nice shadows and reflections.
 