Fidelity Super Resolution in 2021

At the end of the day DLSS is a little meh on first-person shooters, causing latency.

Not sure what you mean by causing latency. Overall, DLSS will reduce latency because it reduces frame time. It's also worth noting that DLSS 2.1 is now being used in VR, an area that is very sensitive to latency. Nvidia Reflex, where supported, will also reduce input latency.
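
To put rough numbers on the frame-time point, here's a minimal back-of-the-envelope sketch. The frame rates and the 1 ms upscaling cost are assumptions picked purely for illustration, not measurements from any particular game or card:

```python
# Illustrative arithmetic only: why rendering at a lower internal resolution
# and upscaling tends to *reduce* per-frame latency rather than add to it.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

native_fps = 60.0        # assumed frame rate rendering at native resolution
upscaled_fps = 90.0      # assumed frame rate at the lower internal resolution
upscale_cost_ms = 1.0    # assumed fixed cost of the upscaling pass itself

native_ms = frame_time_ms(native_fps)                        # ~16.7 ms
upscaled_ms = frame_time_ms(upscaled_fps) + upscale_cost_ms  # ~11.1 ms + 1 ms

print(f"native:   {native_ms:.1f} ms per frame")
print(f"upscaled: {upscaled_ms:.1f} ms per frame")

# As long as the frame-time saving is bigger than the upscaling cost,
# end-to-end latency goes down, not up.
```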


Let's just hope the AMD version doesn't do the same.

AMD are in a mess of their own making by producing a stripped-down, budget-conscious, cost- and power-limited chip for the consoles, RDNA2. This can be seen in the lackluster RT performance, which also gives a hint of what is to come. At best we can hope that RDNA3 is not held back by MS and Sony.
 
AMD are in a mess of their own making by producing a stripped-down, budget-conscious, cost- and power-limited chip for the consoles, RDNA2. This can be seen in the lackluster RT performance, which also gives a hint of what is to come. At best we can hope that RDNA3 is not held back by MS and Sony.

But, but, but you weren't saying the RTX 2000 series had lackluster performance.

Nvidia's first attempt vs AMD's first attempt, they're both on par.

What devs manage to do with the consoles will show over the next 5+ years. Just look at what has been achieved in the past on such lacking hardware; it's amazing what the consoles actually manage to produce.

That new Ratchet & Clank game is already looking miles better than anything released on PC.

The Last of Us 2 running on the PS4 Pro on a 4K TV with subpar HDR 400 blew me away, to the point that nothing still comes close to it. Nothing!

The only negative I can give last gen is performance, but I would definitely love to play The Last of Us 2 again on a PS5 and be blown away again.

One thing PC gamers always fail at is comparing PC development with console development.

Don't do it; try running GTA 5 on a 512MB combined memory setup like Rockstar did with the PS3.
 
But, but, but you weren't saying the RTX 2000 series had lackluster performance.

No, I said Turing was garbage at the time. I skipped Turing.

Nvidia's first attempt vs AMD's first attempt, they're both on par.

Does it matter what attempt they are on? Most people, myself included, don't care what company is making the product. They just want the best bang for the buck.

What devs manage to do with the consoles will show over the next 5+ years. Just look at what has been achieved in the past on such lacking hardware; it's amazing what the consoles actually manage to produce.

That new Ratchet & Clank game is already looking miles better than anything released on PC.

The Last of Us 2 running on the PS4 Pro on a 4K TV with subpar HDR 400 blew me away, to the point that nothing still comes close to it. Nothing!

The only negative I can give last gen is performance, but I would definitely love to play The Last of Us 2 again on a PS5 and be blown away again.

One thing PC gamers always fail at is comparing PC development with console development.

Don't do it; try running GTA 5 on a 512MB combined memory setup like Rockstar did with the PS3.

The last console I owned was the Sega Master System. I really don't care for what is being done with consoles, or mobile for that matter. I do remember high-resolution Space Invaders for the ZX81, but people had moved on to the Spectrum/C64/BBC by the time that was released.
 
Why do I hear Cartman's voice when I read these lackluster comments? Put some effort in, less whine, prove me wrong.

You've already been proved wrong on numerous occasions, yet it seems to go flying over your wee nut. Nvidia are ahead on a single metric, one they've now had two attempts at.

All your spiel about RDNA 2 being a "stripped down budget console chip" is laughable when you consider that AMD are ahead in quite a few benchmarks and trading blows in others.

So, putting that into perspective, this "stripped down budget console chip" is somehow going toe to toe with Nvidia's best in 99% of games out there. They fall behind in games that support ray tracing and that's it, a feature where in a lot of instances people can't even tell the difference between it being on or off besides the obvious performance hit.

Yet those facts never seem to enter your little dome; it's all about the single metric where Nvidia have an advantage after two attempts, and nothing else matters.

It really hasn't sunk in for you that when you trot out your usual spiel of "budget console chip", it looks somewhat derpy when you look at reviews and see AMD demonstrably trading blows in benchmarks, winning some and losing some. A "budget chip" versus Nvidia's best, and Nvidia need to rely on a single option that comparatively few games support to get any convincing wins in reviews.

If anything it's slightly worrying that you can't see the flaws in your own rhetoric. If anything was considered "budget", I certainly wouldn't expect Nvidia's best to lose any benchmarks to it, yet here we are with you trotting out the same spiel time and time again.

How about ceasing the copy-and-paste BS until FSR comes out, when a more direct comparison can be drawn?
 
You've already been proved wrong on numerous occasions, yet it seems to go flying over your wee nut. Nvidia are ahead on a single metric, one they've now had two attempts at.

Are you really trying to justify AMD's poor performance by the number of attempts? That's some 'special' thinking right there.

All your spiel about RDNA 2 being a "stripped down budget console chip" is laughable when you consider that AMD are ahead in quite a few benchmarks and trading blows in others.

AMD are ahead in some benchmarks and way behind in others, that is true. The problem AMD have is that what they are way behind in is also what is considered next gen, namely RT and upsampling.

So, putting that into perspective, this "stripped down budget console chip" is somehow going toe to toe with Nvidia's best in 99% of games out there. They fall behind in games that support ray tracing and that's it, a feature where in a lot of instances people can't even tell the difference between it being on or off besides the obvious performance hit.

Putting things in perspective, and restricting consideration to just the gamer: you can spend the same on a 6800 XT as you would on a 3080, yet with the 6800 XT get garbage RT, no DLSS and higher latency TODAY.

Yet those facts never seem to enter your little dome; it's all about the single metric where Nvidia have an advantage after two attempts, and nothing else matters.

Okay Cartman, keep crying over the number of attempts, while people look at what is available TODAY.

It really hasn't sunk in for you that when you trot out your usual spiel of "budget console chip", it looks somewhat derpy when you look at reviews and see AMD demonstrably trading blows in benchmarks, winning some and losing some. A "budget chip" versus Nvidia's best, and Nvidia need to rely on a single option that comparatively few games support to get any convincing wins in reviews.

If anything it's slightly worrying that you can't see the flaws in your own rhetoric. If anything was considered "budget", I certainly wouldn't expect Nvidia's best to lose any benchmarks to it, yet here we are with you trotting out the same spiel time and time again.

How about ceasing the copy-and-paste BS until FSR comes out, when a more direct comparison can be drawn?

AMD went all out on rasterisation, on a smaller node, with up to 50% higher clock speeds than Nvidia, yet on the PC they only compete in rasterisation. Is that so hard to understand?

Maybe if you spent more time on the tech and less on the insults you wouldn't come across as Cartman.
 
"Cartman" :cry:, think you're the one that's coming across as a bit "special".

Not even gonna bother with the rest of your drivel, as it's already been argued and smacked down before. The fact that you're going on about clock speeds like they're even comparable between architectures says it all. Keep on living in la-la land, captain delusional.
 
"Cartman" :cry:, think you're the one that's coming across as a bit "special".

Not even gonna bother with the rest of your drivel, as it's already been argued and smacked down before. The fact that you're going on about clock speeds like they're even comparable between architectures says it all. Keep on living in la-la land, captain delusional.

There it is, 'screw you guys, I'm going home' :cry::cry::cry:
 
There it is, 'screw you guys, I'm going home' :cry::cry::cry:
Shifting the focus from your point, and you being pretty wrong, only makes you look a bit silly. :p By your own comments the 3000 series is also stripped down. If FSR puts the 6000 and 3000 series on a level playing field from an RT point of view, your comment will look even more foolish. Why not just wait and comment after we've seen it, instead of spouting rubbish time and time again? Again, what is it about GPUs that makes people go crazy...
 
There it is, 'screw you guys, I'm going home' :cry::cry::cry:

Or more like "I'd get more sense talking to the wall than to this delusional fanboy living in his own little world". Your BS has been debunked numerous times yet it never sinks in, which is a problem on your end, nobody else's. You continue to regurgitate the same spiel over and over in multiple threads.

Dunno if I'd call that "special", deluded or just plain sad. Continue to peddle your p1sh; I'll be sticking you on ignore as it's patently obvious that it's pointless talking to someone who can't accept anything outside his own narrow little point of view. Yet another person who buys a GPU and turns into a shilling cult member.
 
Shifting the focus from your point, and you being pretty wrong, only makes you look a bit silly. :p By your own comments the 3000 series is also stripped down. If FSR puts the 6000 and 3000 series on a level playing field from an RT point of view, your comment will look even more foolish. Why not just wait and comment after we've seen it, instead of spouting rubbish time and time again? Again, what is it about GPUs that makes people go crazy...


For most people a GPU is a piece of hardware that runs games; for others it's buying into a cult that must be defended online at all costs. Scientology could take some lessons from some of the people on hardware forums.

I've zero preference for either company, I just want a card to run games, yet some people seem to get it into their heads that when they buy a card they're contractually bound to become a dribbling shill, spouting their card's perceived "superiority" as loudly as possible.

It's a bit sad, to say the least. It would make more sense if these people were getting the cards for free to incentivise their dribble, but nope, they hand over several hundred quid, then act like the manufacturer is their "friend" and go all-out shill mode. I've never managed to figure that one out. :confused:
 
Shifting the focus from your point, and you being pretty wrong, only makes you look a bit silly. :p

I didn't shift the focus. I asked to be proven wrong, which he failed to do. Instead he appeared to run off in a fanboy tantrum, hence - There it is, 'screw you guys, I'm going home' :cry::cry::cry: Perhaps you missed the show back in the day?

By your own comments the 3000 series is also stripped down. If FSR puts the 6000 and 3000 series on a level playing field from an RT point of view, your comment will look even more foolish.

FSR has nothing to do with RT. Did you mean DLSS, which incidentally also has nothing to do with RT? The problem AMD have is that they have to run FSR on the same shader cores that are doing everything else. In comparison, Nvidia runs DLSS on the tensor cores, dedicated hardware that does not interfere with the other tasks the GPU has to perform.
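
A toy frame-budget model shows why that split matters. Every number here is invented purely for illustration, and real GPUs overlap work far less cleanly than this, so treat it as a sketch of the argument rather than a claim about actual hardware:

```python
# Toy frame-time budget: upscaling on shared shader cores vs. on dedicated units.
# All figures below are made-up assumptions for illustration only.

render_ms = 10.0   # assumed time to render the frame at the lower internal resolution
upscale_ms = 1.5   # assumed cost of the upscaling pass itself
overlap = 0.8      # assumed fraction of the upscale that dedicated units can hide
                   # by running alongside other GPU work

# Shared shader cores: the upscale competes for the same units as rendering,
# so its cost adds on serially.
shared_frame_ms = render_ms + upscale_ms

# Dedicated units: most of the upscale overlaps with other work,
# so only the non-overlapped remainder lengthens the frame.
dedicated_frame_ms = render_ms + upscale_ms * (1.0 - overlap)

print(f"shared shader cores: {shared_frame_ms:.1f} ms per frame")
print(f"dedicated hardware:  {dedicated_frame_ms:.1f} ms per frame")
```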

Why not just wait and comment after we've seen it, instead of spouting rubbish time and time again?

I originally commented on latency and how DLSS should reduce it, not increase it. That had nothing to do with AMD's attempts to compete with DLSS.

Again, what is it about GPUs that makes people go crazy...

You are confused again; fanboys throwing tantrums is not the same as people going crazy.
 
Or more like "I'd get more sense talking to the wall than to this delusional fanboy living in his own little world". Your BS has been debunked numerous times yet it never sinks in, which is a problem on your end, nobody else's. You continue to regurgitate the same spiel over and over in multiple threads.

Dunno if I'd call that "special", deluded or just plain sad. Continue to peddle your p1sh; I'll be sticking you on ignore as it's patently obvious that it's pointless talking to someone who can't accept anything outside his own narrow little point of view. Yet another person who buys a GPU and turns into a shilling cult member.

Rather than throw tantrums, prove me wrong? :cry::cry::cry:
 
I didn't shift the focus. I asked to be proven wrong, which he failed to do. Instead he appeared to run off in a fanboy tantrum, hence - There it is, 'screw you guys, I'm going home' :cry::cry::cry: Perhaps you missed the show back in the day?



FSR has nothing to do with RT. Did you mean DLSS, which incidentally also has nothing to do with RT? The problem AMD have is that they have to run FSR on the same shader cores that are doing everything else. In comparison, Nvidia runs DLSS on the tensor cores, dedicated hardware that does not interfere with the other tasks the GPU has to perform.



I originally commented on latency and how DLSS should reduce it, not increase it. That had nothing to do with AMD's attempts to compete with DLSS.



You are confused again; fanboys throwing tantrums is not the same as people going crazy.
No, I meant RT, since the only realistic reason for DLSS/FSR is extra RT performance. Or do we really need 300-plus FPS on most titles with DLSS but without RT? You're talking about how console or AMD cards are stripped down; what has that got to do with DLSS/FSR?
 
No, I meant RT, since the only realistic reason for DLSS is extra RT performance. Or do we really need 300-plus FPS on most titles with DLSS but without RT? You're talking about how console or AMD cards are stripped down; what has that got to do with DLSS?


Ignore him or you'll be going round in circles for hours.
 