
really getting fed up with the posts stating RTX/DLSS does not work this gen

Soldato
Joined
18 Feb 2015
Posts
6,480
The order is: off/on/off/on/off/on

Interesting, that makes DLSS an absolute smash hit then! At least for this game.

Finally the tech itself demonstrates its promise, and now what's left is for them to push it in more games (and at/near launch). Fingers crossed.
 
Man of Honour
Joined
13 Oct 2006
Posts
90,821
So perhaps RT cores will be dead after all. I hope that even if they aren't continued, there will still be support for the RTX 20 series and these cores can be utilised for RT in future games even if "RTX" isn't in the title, because if not, that would be a real bummer; a 2070S would effectively just be a 1080 Ti, even in RT performance.

That would be one tough lesson for RTX adopters to learn. It would also mean upgrading ASAP to a new GPU to get ray tracing support, so that's probably what will happen :(. It would feel like ray tracing was almost mis-sold.

No they won't. nVidia is pushing RT hard; there's far too much behind-the-scenes work going on with it for them to drop it any time soon. The actual implementation will change down the line though.

Yet they would need to split the normal CUDA cores across multiple separate chips as we get to smaller nodes. It won't make sense, but if Nvidia attempts something like that, be ready to see how it flops.

What?
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.

What didn't you understand? CUDA cores would need to be split across an MCM design as well. So according to your post, you believe Nvidia can make an MCM GPU in which multiple CUDA dies are interconnected with multiple Tensor and RT core dies? It would be too complex to work.

Only unified CUDA cores doing the whole job would be able to work on MCM, like RDNA2 does.
 
Man of Honour
Joined
13 Oct 2006
Posts
90,821
What if the implementation changes so it no longer requires or utilises RT cores, therefore making them redundant?

The meat of ray tracing is always going to come down to massive amounts of the same relatively simple calculations, so however the implementation changes, you can still accelerate that bit with just some changes at the software level. Sadly I'm not expert enough to explain it in detail to someone else, but I do know enough to know that a lot of the stuff posted in these ray tracing threads is utterly wrong and/or a misunderstanding.
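To give a rough idea of what "the same relatively simple calculations" means in practice: the workhorse of BVH traversal in ray tracing is a ray vs axis-aligned bounding box test, just a handful of subtracts, multiplies and min/max operations repeated millions of times per frame. Below is a minimal Python sketch of the standard slab method, purely for illustration; it says nothing about how any vendor's RT hardware is actually wired.

```python
def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned bounding box?

    origin, box_min, box_max are (x, y, z) tuples; inv_dir holds
    1/direction per axis, precomputed once per ray.
    """
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        # Distances along the ray to the two slab planes on this axis.
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    # The ray hits the box only if all three slab intervals overlap.
    return t_near <= t_far

# Example: a ray along +X from the origin against a box spanning x in [2, 3].
print(ray_hits_aabb((0.0, 0.0, 0.0),
                    (1.0, float("inf"), float("inf")),
                    (2.0, -0.5, -0.5),
                    (3.0, 0.5, 0.5)))  # True
```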

What didn't you understand? CUDA cores would need to be split across an MCM design as well. So according to your post, you believe Nvidia can make an MCM GPU in which multiple CUDA dies are interconnected with multiple Tensor and RT core dies? It would be too complex to work.

Only unified CUDA cores doing the whole job would be able to work on MCM, like RDNA2 does.

RT cores only hang off the SMs in Turing out of convenience: it doesn't require a major redesign and lets them "cheat" a bit to get performance up to usable levels this generation (current ray tracing features, even in games like Quake 2 that use it in place of all lighting, aren't pure ray tracing despite being most of the way there). It doesn't need to be that way and won't stay that way in the future.

MCM doesn't require CUDA cores to be chopped up in the way you are implying. There are many different approaches to an MCM design, as I've tried to say many times on these forums, but people still seem to be clinging to the idea that it will be something like Zen. (That isn't to say MCM designs won't spread out CUDA cores, but that is another matter.)
 
Permabanned
Joined
22 Oct 2018
Posts
2,451
So... you're stating that RTX/DLSS does not work this gen?

I think it has limited appeal. With a high-res monitor, ray tracing lacks power at the moment. DLSS, well, the times I have used it, it showed a massive performance increase (huge jump in fps) but degraded the quality. Even so, it's a useful "quick adjustment" if your system can't handle a game. From what I've heard, the main body of people who thought it was great were people using 4K TVs, so they tended not to notice the degradation of picture quality as much as someone who is two feet from a high-res monitor, but they certainly did appreciate the boost in fps. I can honestly say that since I bought an RTX card I have used neither, apart from a test of each that lasted about ten minutes.
 
Caporegime
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
No way will Nvidia be dropping Tensor cores anytime soon. They are pushing forward with ray tracing and deep learning, and that's a good thing. I'm sure it will change guise at some point, but with such new tech this is the best way forward for now. Think of it like the old dedicated PhysX card, but on the GPU instead of needing a separate card.
 
Soldato
Joined
3 Jan 2006
Posts
24,945
Location
Chadderton, Oldham
I think ray tracing performance isn't bad. I have BF V maxed on my laptop with no DLSS at 1080p and it runs smooth as butter. Granted, it's only 1080p, but this is a 2070MQ we're talking about, not a 2080 Ti.
 
Soldato
Joined
9 Nov 2009
Posts
24,769
Location
Planet Earth
If a new Crysis-type game with ground-breaking tech came out, everyone would cry at how crap it is because they couldn't run it at max settings at 8K

Except people complained back then too, when Crysis was released.

However, the biggest difference was that we had the 8800GT, which released at less than half the price of the 8800GTX launched 12 months earlier and gave 85% to 90% of the performance of that card. The 8800GT would be the equivalent of an RTX 2060 delivering 85% to 90% of the RTX 2080 Ti/Titan RTX at just under £300 in today's money. This is why it is considered such a legendary card and survived so long via rebrands. I had the slightly quicker 8800GTS 512MB myself.
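To put rough numbers on that value argument: the performance figures below come straight from the post above, and the prices are loose approximations rather than exact launch prices, so treat this as a back-of-the-envelope sketch only.

```python
# Back-of-the-envelope perf-per-pound using the ~85-90% figure from the post.
# Prices are rough approximations, not exact launch prices.
cards = {
    "8800GTX":                 {"price_gbp": 450,  "relative_perf": 1.000},
    "8800GT":                  {"price_gbp": 200,  "relative_perf": 0.875},  # ~85-90% of the GTX
    "RTX 2080 Ti":             {"price_gbp": 1100, "relative_perf": 1.000},
    "hypothetical ~£300 card": {"price_gbp": 300,  "relative_perf": 0.875},  # ~85-90% of the Ti
}

for name, c in cards.items():
    # Arbitrary units: relative performance delivered per £100 spent.
    value = c["relative_perf"] / c["price_gbp"] * 100
    print(f"{name:24s} {value:.2f} perf per £100")
```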
 
Soldato
Joined
28 May 2007
Posts
10,049
Except people complained back then too, but the biggest difference was that we had the 8800GT, which released at less than half the price of the 8800GTX launched 12 months earlier and gave 85% to 90% of the performance of that card. The 8800GT would be the equivalent of an RTX 2060 delivering 85% to 90% of the RTX 2080 Ti/Titan RTX at just under £300 in today's money.

A card like that at just under £300 would be in my machine the day after release if it ever came around. More chance of winning the lottery these days though.
 
Soldato
Joined
28 May 2007
Posts
10,049
It's a shame; since the R9 290 and GTX 970, things have definitely slowed down for the mainstream and lower-end enthusiast areas.

Yeah, I'm pretty much stuck on my Vega 64 atm, as it still competes well with anything up to the £400 mark (5700 XT). Hopefully this year the £400-450 market gets a decent boost, as that's around the max I would be willing to spend. I would prefer to spend less, but in today's market I know that's probably what I would need to spend to get a decent boost.
 
Soldato
Joined
9 Nov 2009
Posts
24,769
Location
Planet Earth
Yeah, I'm pretty much stuck on my Vega 64 atm, as it still competes well with anything up to the £400 mark (5700 XT). Hopefully this year the £400-450 market gets a decent boost, as that's around the max I would be willing to spend. I would prefer to spend less, but in today's market I know that's probably what I would need to spend to get a decent boost.

AMD is no better nowadays; they just do the minimum to pip Nvidia, but as shown by the RX 5500 XT, Nvidia can end up with better-value products. Best to hang on to what you have and just upgrade less frequently. Then at least you won't get a pathetic boost in performance and act as a beta tester.
 
Soldato
Joined
28 May 2007
Posts
10,049
AMD is no better nowadays; they just do the minimum to pip Nvidia, but as shown by the RX 5500 XT, Nvidia can end up with better-value products. Best to hang on to what you have and just upgrade less frequently. Then at least you won't get a pathetic boost in performance and act as a beta tester.

Yeah, the 5500 XT is a perplexing release, and yeah, AMD just price according to what Nvidia have on the market; no real products to upset the status quo. I like value for my money, so that's exactly what I am doing. Feeling the urge due to gaming at 4K, but my wallet is more important to me than more fps.
 
Soldato
Joined
18 Feb 2015
Posts
6,480
Will this tech be retroactively implemented for some older games as well? I would love to use the upscaling in some well-known performance hogs. Right now, as far as I understand it, supporting DLSS still involves a fair amount of legwork for the game developer?

DLSS has to be properly integrated into a game; it can't be automatically applied after a game is created. So someone who understood the source code and had access to it would have to do some work on old games if they wanted to integrate DLSS.
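To make that "legwork" concrete: a DLSS-style upscaler needs the engine to render at a lower internal resolution and feed it colour, depth, motion vectors and the camera jitter every single frame, which is why it can't be bolted on from outside. The Python pseudocode below is only a sketch of that data flow; `upscaler.evaluate`, the engine hooks and the buffer names are hypothetical stand-ins, not the real NGX/DLSS SDK calls.

```python
from dataclasses import dataclass

@dataclass
class FrameInputs:
    color: object            # low-resolution lit colour buffer
    depth: object            # matching depth buffer
    motion_vectors: object   # per-pixel motion vectors from the engine
    jitter: tuple            # sub-pixel camera jitter applied this frame

def render_frame(engine, upscaler, render_res, output_res):
    """Illustrative per-frame flow; 'engine' and 'upscaler' are hypothetical objects."""
    # 1. The engine renders internally at a lower resolution,
    #    with a known sub-pixel jitter applied to the camera.
    jitter = engine.next_camera_jitter()
    buffers = engine.render_scene(render_res, jitter)

    # 2. It then hands the upscaler the extra per-frame data it needs.
    inputs = FrameInputs(
        color=buffers.color,
        depth=buffers.depth,
        motion_vectors=buffers.motion_vectors,
        jitter=jitter,
    )

    # 3. Hypothetical stand-in for the real evaluate call; the actual SDK
    #    API differs, but the per-frame data requirements are the point.
    final_image = upscaler.evaluate(inputs, output_res)

    # 4. UI and final post-processing are typically drawn at the full
    #    output resolution afterwards, then the frame is presented.
    engine.present(final_image)
```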

Hmm, I guess DLSS won't really be more than a rare addition. That's just my opinion extrapolating from this.
 
Soldato
Joined
21 Jul 2005
Posts
19,982
Location
Officially least sunny location -Ronskistats
Yeah, the 5500 XT is a perplexing release, and yeah, AMD just price according to what Nvidia have on the market; no real products to upset the status quo. I like value for my money, so that's exactly what I am doing. Feeling the urge due to gaming at 4K, but my wallet is more important to me than more fps.

I don't see the point in the 5500 XT now we can see how the 5600 XT performs. They could have used the weaker silicon and badged it as a standard 5600 entry card. AMD are guilty of copying Nvidia by throwing out loads of confusing versions to buy instead of keeping it simple with nice pricing tiers.
 
Soldato
Joined
20 Jun 2011
Posts
3,673
Location
Livingston
The way I see it, there is nothing revolutionary about taking hardware from the last century, polishing it, then re-launching it with a new feature set most of us either can't use or don't care about.

Until capable 4K cards are within reach of the majority, GTF Nvidia and AMD.

If both insist on adding features which cannot be used by the majority of owners, it may as well come in the form of raw performance which can be tapped into at a later date.

But that wouldn't be good for business? Well, bolt... my mind isn't wired to be sympathetic to capitalism.
 