
The real fine wine

There's a spot in the park, somewhere you have almost no reason to go in the game (I think there's one minor mission vaguely nearby), where RT works really well with indirect lighting, etc. - but probably 99% of players will never go near it, LOL. And another place in Pacifica, some steps leading into a little alley, where again the difference between RT on and off is huge, especially in motion - but it's one tiny spot.

More importantly, I think most gamers don't actually know what advanced graphics settings are or what they do, so hunting for spots like that is something they'll never do, cos they don't know what they're looking at. People don't even know what SSAO is, where it's used, or why an 'ultra' setting for it is pointless in a lot of action games in motion.

Most gamers don't know what graphics settings do. Depth of field, chromatic aberration and vignette are ones I've seen a lot in games recently, and funnily enough I dislike all of them - the last two are considered flaws in photography, yet people turn these settings on, wonder why their fps tanks, and probably have no idea what they do or whether they actually think it looks any nicer.

The issue here is gamers can be fooled to think something looks good.
 
I thought people paid good money for high-end GPUs for both visuals and performance, so why are people so keen on a feature that downgrades visual clarity? (DLSS does exactly that to improve performance.)

If you're fine with a loss of visual clarity to gain performance, then why want a high-end GPU in the first place? It seems like most people would be happy with a lower-tier card.

Because ray tracing/RTX, depending on the game/area, vastly improves IQ but drops the fps to unplayable levels. With DLSS, though, those games become playable. The gain in fps outweighs the drop in IQ imo - within reason: if I'm already getting a constant 60-70+ fps without DLSS then I won't use it.
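
To put numbers on the clarity-vs-performance trade-off: DLSS-style upscalers render internally at a lower resolution and reconstruct the output. A rough sketch of the pixel counts involved - the per-axis scale factors here are assumptions based on commonly reported mode values, not official figures:

```python
# Rough pixel-count arithmetic behind DLSS-style upscaling.
# Per-axis internal render scales per mode are assumptions
# (commonly reported ballpark values, not official numbers).

MODES = {
    "Native":      1.0,
    "Quality":     0.67,
    "Balanced":    0.58,
    "Performance": 0.5,
}

def render_pixels(width, height, scale):
    """Pixels actually shaded per frame at a given per-axis scale."""
    return int(width * scale) * int(height * scale)

if __name__ == "__main__":
    w, h = 3840, 2160  # 4K output
    native = render_pixels(w, h, 1.0)
    for mode, scale in MODES.items():
        px = render_pixels(w, h, scale)
        print(f"{mode:12s} {px:>10,d} px  ({native / px:.2f}x fewer than native)")
```

Performance mode at 4K shades a quarter of the pixels of native, which is why the fps headroom is big enough to make ray tracing playable.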

Most people honestly don't notice certain effects while a game is in motion; what they will notice is stuttering and large shifts in the image while moving. People say they can't tell the difference between native and DLSS - well, don't take a screenshot or look at screenshots. What people need to do is play the game at 4K, run through a whole scene, then run the same scene again with DLSS: you will see a difference.

Variance whilst in motion is the kind of degradation people do notice.

I agree with the motion clarity, but there is no stuttering with dlss.

What is annoying is that for a lot of the game, RT on or off (aside from reflections) isn't that big a deal, but there are the odd spots where RT makes a massive difference - and they're like 1% of the game.

That's the problem with RT for the most part: to get it to really stand out they have to go really overboard, and then it just looks like it was done to show off the effect, which probably detracts from the game's overall look.

Yup, I did a lot of comparisons between RTX and no RTX for cyberpunk here:

[Imgur gallery: ~30 RTX on/off comparison screenshots]

90% of the time it's nigh-on pointless, but in the areas where you do notice it, it's pretty stunning. It's nice not having reflections vanish right before your eyes just because you change your camera angle.

I would love to know how heavily nvidia are applying certain parts of the tech beyond what's actually required, just to cripple AMD's cards in ray tracing - i.e. like they did with tessellation in Crysis 2 and HairWorks tessellation in The Witcher 3.

I think both look really good, but if I really look closely I would say the bottom is native and the top is DLSS, and the only reason I say that is the bottom pictures. DLSS is something I really hope Microsoft and AMD can get right, but honestly I don't think RDNA 2 will get the feature - I think it will come with next-gen RDNA 3 later this year or next year.

Yup, my money is on RDNA 3 too. I also have a feeling that whatever AMD come up with will be applicable to more games, going by past history.
 

Well that's the point I was making: if people are happy to gain performance at the cost of IQ, then lowering settings or resolution and using different forms of AA will do that too, so a lower-end GPU would be sufficient for this use case.

I just mean stuttering in general is something people will notice regardless; it isn't DLSS-specific. But anything with large degrees of variance whilst in motion is something people will notice.
 

Lowering settings is good and something I always do, as "ultra" generally brings literally nothing over "high/very high" (especially shadows and draw distance), but beyond that, dropping settings further to medium/low massively affects quality, particularly effects, particles, lighting and textures.

Lowering resolution on a display without a scaler (or with a **** one) massively reduces IQ - TVs are grand when running anything but native resolution, but monitors are awful at anything but native res. This is where DLSS trumps all.

AA isn't the same as lowering/increasing resolution either; a lot of games can still have jaggies even on 4K panels at 27/32". Personally I'd rather have the likes of TAA even if it does blur the image a bit - I despise jaggies and find them really off-putting.
 

Unfortunately I don't have before-and-after comparisons, and screenshots don't always show it best, but in some areas of the game RT does a lot more than others:

[Imgur gallery: 3 in-game screenshots]
 
I would, but on youtube the quality is compressed so I can't see how good it is. Or do you think youtube only compresses DLSS videos when they look bad? :)

Videos are just to educate you. Try taking some in-game screenshots. Wait, didn't I already say that...

Of course you can, the higher FPS gives it away. You are using a screenshot of a compressed youtube video. Instead, take the pics during gameplay. It looks and runs great at 1440p with ray tracing maxed and DLSS on a 3080.
 


You just summarized... console gamers :) (coming to PC)

And if they'd never come across to PC, they'd never have heard of this terminology or gone down the rabbit hole that is 'seeing it'. And once you've seen that... PC GAMER.

wallet = empty
 

I wish this was the case, but honestly, to use WoW as an example: so many people used the 'Nvidia experience settings' or whatever it's called, it automatically pushed resolution scale to 200%, and people were like "why am I at 5 fps?" - they couldn't figure it out and had no idea what that setting did.

Most people, even PC gamers, don't know anything about how to configure PC parts and in-game settings, hence why presets exist. But yes, once you understand it and want to take advantage of it, that wallet gets empty fast XD.
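
For anyone wondering why that 200% slider hurts so much: the scale applies per axis, so the pixel count (and roughly the shading cost) goes up with its square. A quick illustrative sketch:

```python
# Why a 200% resolution scale tanks frame rates: the per-axis
# scale is squared in the pixel count. Numbers are illustrative.

def scaled_pixels(width, height, render_scale_percent):
    """Pixels shaded per frame at a given render-scale percentage."""
    s = render_scale_percent / 100.0
    return int(width * s) * int(height * s)

base = scaled_pixels(2560, 1440, 100)     # native 1440p
doubled = scaled_pixels(2560, 1440, 200)  # the "200%" setting
print(doubled / base)  # 4.0 -> roughly 4x the shading work
```

So a setting that sounds like "twice the resolution" quietly quadruples the work per frame, which is exactly the 5 fps surprise.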
 
If only Firestrike could be made into a game, AMD's 6900 XT might actually beat the 3070.

AMD really need a DLSS equivalent and fast, otherwise it's only going to get worse.

They don't really need anything, cos their cards sell as fast as they can make them. It's good that AMD have approached the high end because it forced Nvidia's hand in bringing the 3080 FE out at a strong price point. It doesn't really matter that they aren't winning benchmarks.
 

With the current crypto, lockdown, covid, crazy world, all the gfx cards are currently selling out. Hardly a win.

On the second point, the 3080 FE and 3070 FE were priced the way they were because of next-gen consoles coming in at £450, not because of Big Navi.
 
Oh dear, AMD's DLSS equivalent may not work on the 6000 series.

Go to 2:11.

This is great, now we have "Machine Learning Accelerators", even better than tensor cores. :D
What these units do is run matrix maths - tensor operations - and they do it very fast, but they are not the only way to run this type of maths. It can even be done on the CPU (at a much lower speed), or on the shader cores in the GPU.
Unlike with a monolithic GPU chip, adding dedicated hardware to a chiplet design is a logical step. But unless DirectML becomes a standard, AMD could put thousands of Machine Learning Accelerators on their boards and they would be useless. With DirectML, such hardware can be used for faster upscaling and denoising.

Gamer Meld is stupid - the 6000 series can also do upscaling, it just won't do it as fast without "Machine Learning Accelerators". :)
https://www.techspot.com/article/2151-nvidia-ampere-vs-amd-rdna2/
"As previously mentioned, the Compute Units in RDNA 2 now support more data types; the most notable inclusions are the low precision data types such as INT4 and INT8. These are used for tensor operations in machine learning algorithms and while AMD have a separate architecture (CDNA) for AI and data centers, this update is for use with DirectML.
This API is a recent addition to Microsoft's DirectX 12 family and the combination of hardware and software will provide better acceleration for denoising in ray tracing and temporal upscaling algorithms. In the case of the latter, Nvidia have their own, of course, called DLSS. Their system uses the Tensor Cores in the SM to perform part of the calculations, but given that a similar process can be constructed via DirectML, it might seem that these units are somewhat redundant. However, in both Turing and Ampere, the Tensor Cores also handle all math operations involving FP16 data formats.
With RDNA 2, such calculations are done using the shader units"
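
The point above about tensor maths running anywhere is easy to demonstrate: it's the same matrix multiplication whether it lands on dedicated units, shader cores, or a CPU - only speed and precision differ. A minimal NumPy sketch (sizes are arbitrary and this is CPU code, not real GPU code):

```python
# "Tensor maths" here is just matrix multiplication. Dedicated
# units (Tensor Cores / ML accelerators) run the same operation
# faster, often at low precision such as INT8; shader cores or
# a CPU can do it too, just slower. Sketched with NumPy:
import numpy as np

rng = np.random.default_rng(0)

# FP32 version - the path RDNA 2's shader units would take.
a32 = rng.standard_normal((64, 64)).astype(np.float32)
b32 = rng.standard_normal((64, 64)).astype(np.float32)
c32 = a32 @ b32

# INT8 version - one of the low-precision data types RDNA 2's
# Compute Units now accept. Accumulate in int32 to avoid
# overflow, as real hardware does.
a8 = rng.integers(-128, 127, (64, 64), dtype=np.int8)
b8 = rng.integers(-128, 127, (64, 64), dtype=np.int8)
c8 = a8.astype(np.int32) @ b8.astype(np.int32)

print(c32.shape, c8.dtype)  # same operation, different precision
```

Which is why "can it do upscaling?" is the wrong question - anything can; the question is how fast, and that's where dedicated low-precision hardware earns its keep.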
 

Funny how nvidia seemed to time things around AMD's Big Navi reveal then, eh.
 


Sure they were shillbot :rolleyes:.
 