NVIDIA ‘Ampere’ 8nm Graphics Cards

HRL

Soldato
Joined
22 Nov 2005
Posts
3,028
Location
Devon
Don’t get me wrong, I bought a 2080Ti for the raw performance, not RT, but to say RT is just eye candy and irrelevant is to do it a disservice.

The same has been said about every “new” tech that’s been added to GPUs over the last 30 years, but that’s how new features become mainstream eventually.
 
Associate
Joined
30 Jan 2016
Posts
75
It's not "important" in any way? It's just a bit of graphical candy.

RT is important in more ways than you clearly understand. Maybe look into how current lighting techniques are done in games and how many hours it takes to manually light scenes; it's a painstaking process, not to mention the obvious benefits to eye candy. You also have to remember that current games with RT were not built from the ground up with RT in mind, but have RT effects added on.

With the consoles also adopting RT, I'd say more people are interested in, or at least curious about, the benefits RT can bring, but the impact it has on game development is not something a lot of people realise.
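
To make the lighting-workflow point concrete, here is a minimal C++ sketch contrasting the two approaches: sampling a pre-baked lightmap (the end product of those hours of offline baking and hand-tweaking) versus answering the same visibility question at runtime with a shadow ray. Every type and name here is illustrative, not any engine's actual API.

#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// Baked approach: lighting was computed offline (and hand-tweaked by
// artists), then stored in a texture. At runtime it is only a lookup.
struct Lightmap {
    int w, h;
    std::vector<float> texels;  // precomputed irradiance values
    float sample(float u, float v) const {
        int x = int(u * (w - 1) + 0.5f);  // nearest-texel lookup
        int y = int(v * (h - 1) + 0.5f);
        return texels[y * w + x];
    }
};

// Ray-traced approach: ask "can this point see the light?" directly,
// by testing a shadow ray against the scene geometry (spheres here).
struct Sphere { Vec3 c; float r; };

bool occluded(Vec3 p, Vec3 light, const std::vector<Sphere>& scene) {
    Vec3 d{light.x - p.x, light.y - p.y, light.z - p.z};
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    d = {d.x / len, d.y / len, d.z / len};
    for (const Sphere& s : scene) {
        // standard ray/sphere intersection test
        Vec3 oc{p.x - s.c.x, p.y - s.c.y, p.z - s.c.z};
        float b = oc.x * d.x + oc.y * d.y + oc.z * d.z;
        float c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - s.r * s.r;
        float disc = b * b - c;
        if (disc > 0.0f) {
            float t = -b - std::sqrt(disc);
            if (t > 1e-4f && t < len) return true;  // blocker before the light
        }
    }
    return false;
}

int main() {
    Lightmap lm{2, 2, {1.0f, 0.8f, 0.8f, 0.2f}};
    std::printf("baked sample: %.2f\n", lm.sample(0.9f, 0.9f));

    std::vector<Sphere> scene{{{0, 1, 0}, 0.5f}};
    std::printf("shadow ray: %s\n",
                occluded({0, 0, 0}, {0, 3, 0}, scene) ? "shadow" : "lit");
}

The baked path is a cheap lookup but frozen at bake time; the traced path pays for intersection tests on every query, which is exactly the work the RT hardware exists to accelerate.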
 
Associate
Joined
23 Aug 2005
Posts
1,273
Soon the question will be moot. Both AMD and nVidia cards will have RT tech in them going forward. There won't be an opt-out option; you will be paying for it whether it's enabled or disabled in game.
 

bru

Soldato
Joined
21 Oct 2002
Posts
7,360
Location
kent
Exactly, Nvidia are very good at that, and I mean that as a compliment. Bad for the consumer, but very smart as a business.

Not just NVIDIA; we all saw the 5700 XT that was supposed to be called the 690.
If that is not renaming and then charging more, then I must be clueless.

And to the guy that said nobody cares about RT.

Really.......
I mean really......

If that is the case, then why, whenever you see any presentation about the next-gen consoles or next-gen GPUs, is hardware ray tracing mentioned?

Could it be that it is actually a big thing and that we are just starting out down the path? Things will improve.
Just remember, next time you watch a movie, that most of the special effects are ray traced, unless you're watching some really bad movies. :)
But nobody cares about RT, so let's all go back to CGA sprites, because who needs progress.

Oops, went off on a bit of a rant here, apologies.
 
Soldato
Joined
18 Feb 2015
Posts
6,484
Expect limited RT usage still. Don't believe me? Go look at the UE5 demo again: it focuses on rasterisation rather than RT for its GI. Just as we suffered with poor lighting this gen because consoles were too weak for even half-decent GI solutions, so it will be next gen vis-à-vis RT.

RT will be the extra eye-candy PC can enjoy over consoles, nothing more. And people believing 4x performance (RT) over Turing? LOL. Not even in the fully path-traced games. Go look at how it's achieved (re Shaders & RT cores) and then think about the hardware that would be required for that. It's ridiculous.
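
For anyone who wants to sanity-check that, here's the back-of-envelope version in C++; the rays-per-pixel count is a loose illustrative assumption (a few samples per pixel times several bounces), not a measured figure from any game.

#include <cstdio>

int main() {
    // Rough ray budget for fully path-traced rendering at 4K 60fps.
    const double pixels     = 3840.0 * 2160.0;  // ~8.3 million
    const double fps        = 60.0;
    const double raysPerPix = 100.0;  // assumption; offline film renderers use far more

    const double raysPerSec = pixels * fps * raysPerPix;
    std::printf("rays/second needed: %.1f billion\n", raysPerSec / 1e9);

    // The 2080 Ti was marketed at ~10 'Gigarays/s' peak; sustained
    // throughput with shading and incoherent rays is lower still.
    std::printf("multiple of ~10 Gigarays/s peak: %.1fx\n", raysPerSec / 10e9);
}

Even with generous rounding, that is several times Turing's headline peak before any shading, denoising or memory traffic is counted.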
 
Man of Honour
Joined
13 Oct 2006
Posts
91,171
RT will be the extra eye-candy PC can enjoy over consoles, nothing more. And people believing 4x performance (RT) over Turing? LOL. Not even in the fully path-traced games. Go look at how it's achieved (re Shaders & RT cores) and then think about the hardware that would be required for that. It's ridiculous.

We are at a fairly early stage in terms of hardware RT implementation - still at the point where optimisations can make for big jumps in performance, so a 4x or even higher jump isn't out of the question without anything stupid hardware-wise. Additionally, a good slice of the performance considerations relates to memory bandwidth and caching, which is far from optimal for RT in Turing.
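
As a toy illustration of the bandwidth/caching point (a CPU analogy, not a GPU measurement): diverging rays touch BVH and geometry memory in a near-random order, where rasterisation streams through it. Summing the same array in order versus via shuffled indices shows how much the access pattern alone can cost.

#include <algorithm>
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const size_t n = 1 << 24;  // ~16M floats, far larger than any cache
    std::vector<float> data(n, 1.0f);
    std::vector<size_t> idx(n);
    std::iota(idx.begin(), idx.end(), 0);

    auto run = [&](const char* label) {
        auto t0 = std::chrono::steady_clock::now();
        double sum = 0.0;
        for (size_t i : idx) sum += data[i];  // access order set by idx
        auto t1 = std::chrono::steady_clock::now();
        std::printf("%s: %6.1f ms (sum %.0f)\n", label,
                    std::chrono::duration<double, std::milli>(t1 - t0).count(),
                    sum);
    };

    run("coherent (raster-like) ");  // sequential, cache-friendly access
    std::shuffle(idx.begin(), idx.end(), std::mt19937{42});
    run("incoherent (ray-like)  ");  // scattered, cache-hostile access
}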
 
Soldato
Joined
20 Aug 2019
Posts
3,031
Location
SW Florida
But you will once it's mainstream. The console generation is making that happen very, very soon.

Once it becomes mainstream, the 3000 series will be too old and too slow to run it well.

RT will eventually be nice, but the early-adopter tax just isn't delivering the goods right now and I seriously doubt it will with the 3000.

Now, if they can get pricing down to where the cost of RT to the consumer is buried in the "noise", we can enjoy our equipment while the manufacturers continue advancing RT as best they can.
 
Soldato
Joined
16 Jun 2004
Posts
3,215
Not seeing anything in these leaks that will change my mind on skipping the 3xxx series and waiting for 4xxx cards in 2022-23 to replace my 2080, TBH.

I really can't see native 4K 60fps RT being achievable until the 4080/Ti cards arrive. The 3xxx cards still look like they will only be able to do it by relying on 'smoke & mirrors' DLSS & VRS trickery.
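
For what it's worth, the 'trickery' is mostly pixel arithmetic: shade fewer pixels internally, then reconstruct. A rough sketch, assuming a DLSS-Quality-style internal resolution of ~67% per axis (the commonly quoted figure, not anything confirmed for the 3xxx cards):

#include <cstdio>

int main() {
    const double outW = 3840.0, outH = 2160.0;  // 4K output
    const double scale = 2.0 / 3.0;             // assumed per-axis factor
    const double inW = outW * scale, inH = outH * scale;

    const double shaded = (inW * inH) / (outW * outH);
    std::printf("internal res: %.0fx%.0f\n", inW, inH);                   // 2560x1440
    std::printf("pixels shaded vs native 4K: %.0f%%\n", shaded * 100.0);  // ~44%
}

Shading well under half the pixels is where the 'free' performance comes from; whether the reconstruction counts as smoke and mirrors or a fair trade is the argument.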

I paid £650 for my 2080 in January 2019 and my simple rule for upgrading my GPU is that I won't upgrade until I can get (at least) double the performance of my current GPU for the same price, and that just doesn't look likely with these 3xxx cards.
 
Associate
Joined
14 Aug 2017
Posts
1,195
Have to agree. Raw performance is by far the most important factor - 95% important, and RT maybe 5% at most currently. The next-gen cards need to be capable of pushing 3x 4K screens for the likes of PC3, or greater-than-4K displays at 100Hz for VR. Got to get that right first and foremost.

Raw performance at what? Drawing polygons as fast as possible? So anti-aliasing has been a waste of time? And variable refresh rate tech? And all the other bells and whistles that are now just considered part of that raw performance, but that people said similar things about when they were introduced?
My graphics lecturer in the mid-90s was on about hardware-accelerated ray tracing being the future. It's not just 'candy', it's a major change.
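
For scale, the raw pixel throughput implied by the targets quoted above, with an assumed 2160x2160-per-eye headset standing in for "greater than 4K" VR:

#include <cstdio>

int main() {
    const double single4k60 = 3840.0 * 2160.0 * 60.0;
    const double triple4k60 = 3.0 * single4k60;               // 3x 4K screens
    const double vr100hz    = 2.0 * 2160.0 * 2160.0 * 100.0;  // assumed per-eye panel

    std::printf("single 4K@60 : %.2f Gpix/s\n", single4k60 / 1e9);  // 0.50
    std::printf("triple 4K@60 : %.2f Gpix/s\n", triple4k60 / 1e9);  // 1.49
    std::printf("VR @ 100Hz   : %.2f Gpix/s\n", vr100hz / 1e9);     // 0.93
}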
 