
3080 Ti or 6900 XT for Battlefield 2042 and Warzone?

i.e. another case of people taking some random user's error with no actual substance to back up the claim ;) :cry:

Had a quick google but can't see anything myself. Got links?

Seems like another case of people getting motion/ghosting mixed up with input lag... The ghosting has nothing to do with input lag/latency. Have a read about pixel response time and input lag (although the ghosting caused by DLSS is nothing to do with the display/panel). TFTcentral are very good.

PS. User reports seem fine from people who have actually tried it:

https://www.reddit.com/r/nvidia/comments/jtfvw7/dlss_on_cod_cold_war_in_amazing/

And with DLSS 2.2, the ghosting issues are reduced:

https://www.dsogaming.com/news/rain...n-be-used-in-older-games-to-reduce-artifacts/

This something you've seen on your AMD subreddit by any chance?
Here's a few from this very forum from the COD thread.

Expand your search, try YouTube, Reddit and Nvidia's own forums, you'll find a lot more.

DLSS is crap in warzone. Just a blurry mess - really disappointing..
DLSS only really looks good at 4K. I've tried it at 1440P and it always introduced slight blur which puts me off.
tried it this morning @ 1440p res and DLSS on quality, runs smooth as silk but may just be me or my ageing eyes but DLSS still looks like a blurry mess unless heavy amounts of sharpening are added. Never looks as sharp as the videos nvidia put out
I have plenty more, but you get the idea. Don't get me started on RT either in competitive shooters, Nexus. :D

I know you like to defend DLSS and RT like your life depends on it, but competitive shooters don't show either technology in its best light.

If you have anything of substance, which we both know you won't, send me a private message and we can continue this debate so as not to take the OP's thread off topic. :)
 
Ok so where are you getting the higher input lag/latency parts from then????

Again, I'm not disputing and never have disputed the ghosting/motion issue... You keep saying it adds more latency but have nothing to showcase this? Those posts you linked and the ones I have seen only mention the blur/ghosting... this is not the same as increased latency/lag.

Would be interesting to see what people think of the motion/ghosting if they tried 2.2, as per Neil's post just now:

NVIDIA DLSS 2.2 Quietly Released in Rainbow Six Siege, Can Be Manually Added to Other Games!
https://forums.overclockers.co.uk/t...an-be-manually-added-to-other-games.18931705/

Working fine in Warzone with DLSS 2.2 Mod! - Performance slightly higher, sharper details (less blurry)

EDIT:

And DLSS is actually very good, as a higher frame rate = less input lag/latency.
 
Also, I posted more videos showcasing why the OP might want to take Nvidia over AMD, unlike your post where you use 2 completely different videos that don't show the whole story... Might be worthwhile reminding everyone that you work for AMD ;)

If people post incorrect info, expect to be corrected.
 
Ok so where are you getting the higher input lag/latency parts from then????

The videos I posted compare the same game, res and settings, and both GPUs are overclocked, so it is a decent performance comparison. Neither video has RT on, as no one uses it competitively.

This is what competitive gamers have said about it. I have never tried it myself so can't verify, but there's no reason to believe they are lying given they own the GPUs and use the tech.

I'll give you a few; if you want more, you can put the effort in to look yourself. I don't use RT/DLSS so I don't care, and I don't want to spend my evening finding multiple quotes from the users I've seen complaining about said issues.

However, I thought I'd mention it given the OP asked which GPU is better for COD and BF2042 at 1440P. For COD it's definitely the 6900 XT; for BF2042, who knows, but I have my money on the 6900 XT too.

At the end of the day the OP can do his own research and draw his own conclusions.

[screenshots of user posts]

@Game Can you add any more colour to the latency issue?

Even if you assume that they are lying about latency and aiming, the blurriness and difficulty spotting enemies due to DLSS make enabling it daft.

Which is why I said anyone with a modicum of intelligence... Of course then you pop up saying to use it along with RT... :p

As I can see you are going to take this thread off topic and make it about defending DLSS and RT again, I will bow out. If you want to continue, send me a PM.

Sorry to OP for the off topic. :)
 
The 6900 XT is an excellent card and super powerful, and most importantly you are actually likely to get one, unlike the 3080 Ti! Loved the ones I have had; even though I'm trying a 3080 Ti at the moment, I could easily go back to one and be more than happy :cool:
 
This is what competitive gamers have said about it. I have never tried it myself so can't verify, but there's no reason to believe they are lying given they own the GPUs and use the tech.

I've got no clue who those people are, and where is the proof backing it up? Are we really still at the stage where people take one-liners as factual statements? Have you not learned after things like the HZD and Godfall VRAM/RT performance issues got debunked...

Do you not think that if latency was added, sites such as HU would be picking it up? And people who specifically test input lag, such as Battle(non)sense, would also mention it and show their results?

Like I said, a lot of people get confused between motion/ghosting and latency when it comes to monitors... and it seems the same happens with DLSS. Of course you also want the best motion clarity there is for competitive FPS, but at the same time, higher fps is better for less latency.

If DLSS does add input lag, I've got no issue with that if there is actual evidence to back it up, as there is for the ghosting/motion issues...

And did I say the OP should use it? Nope, I'm simply challenging your incorrect assumptions once again. Also, believe it or not, there are people who like to play online shooters with maxed graphics; some just enjoy the visuals over topping a scoreboard. I've played with plenty of people like this in BF.
 
If DLSS does add input lag, I've got no issue with that if there is actual evidence to back it up, as there is for the ghosting/motion issues...

I've seen some tests where DLSS adds around 1.5 ms of latency at 4K compared to FSR's 0.4 ms. I'll try to find the link.
 
Near as makes no difference performance-wise between the two cards. Whichever you can get, or whichever is cheapest.

Both are monster cards for 1440p; you probably won't use DLSS or FSR at 1440p as these cards churn out good FPS at this resolution.

To bring the rest of the PC up to scratch I'd probably plump for at least an AMD 5600X CPU with some 3600MHz DDR4, and the MSI X570 Tomahawk is a well-priced mobo for the features. An 850W PSU is considered the minimum when running these power-hungry cards.
 
I've seen some tests where DLSS adds around 1.5 ms of latency at 4K compared to FSR's 0.4 ms. I'll try to find the link.

Is it like for like, i.e. where the fps are locked?

Think I recall seeing that too (might have been AMD's marketing? If so, I'll take it with a pinch of salt :p), but here's the caveat even if extremely minor input lag is added: DLSS/FSR gains you substantially more fps, so even if <1 ms of input lag is added by the DLSS/FSR processing, your latency is still going to be far better with the higher FPS than the lower FPS. Basically, any DLSS/FSR-added latency is outweighed by the massive increase in fps.
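For illustration only, here's a minimal back-of-the-envelope sketch in Python of that argument. All the numbers are made-up assumptions, not measurements from any of the linked tests: the point is simply that the render time saved by dropping the internal resolution normally dwarfs the millisecond or so the upscaling pass itself costs.

```python
# Hypothetical numbers only: compare native-res frame time against
# lower-internal-res frame time plus an assumed upscaler cost.

def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame for a given frame rate."""
    return 1000.0 / fps

native_fps = 90.0        # assumed frame rate at native res
internal_fps = 140.0     # assumed frame rate at the reduced internal res
upscaler_cost_ms = 1.5   # assumed cost of the DLSS/FSR pass per frame

native_ms = frame_time_ms(native_fps)                         # ~11.1 ms
upscaled_ms = frame_time_ms(internal_fps) + upscaler_cost_ms  # ~8.6 ms

print(f"native:   {native_ms:.1f} ms/frame")
print(f"upscaled: {upscaled_ms:.1f} ms/frame (incl. {upscaler_cost_ms} ms upscaler cost)")
print("upscaling lowers per-frame latency" if upscaled_ms < native_ms
      else "upscaling raises per-frame latency")
```

With those assumed numbers the upscaled path still comes out roughly 2.5 ms quicker per frame, which is the point being made above.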




EDIT:

Also, if we are talking about playing competitively, all this is rather pointless because you'll be putting everything to low/off anyway, in which case both cards will be pretty much the same...
 
I really, really don't see a 5800X having any problems in Battlefield 2042 with an Nvidia card, but would love to be proven wrong. There are already plenty of CPU-bound sim racing games that prove that point. At this very moment in time the 3080 Ti is better in both rasterisation and RT than the 6900 XT, plus it has DLSS. At the same price point Nvidia wins, no doubt. The AMD card has a couple of advantages like better power consumption and thermals.
 
The AMD card has a couple of advantages like better power consumption and thermals.

This for sure, and also the RDNA2 cards have lower memory latency overall, which probably leads to a smoother and more pleasant experience when it comes to pure rasterisation games at least (I have found frametimes to be exceptionally smooth).

They both have their strengths and weaknesses for sure; it's up to AMD now to counter and surpass Nvidia's strengths when it comes to the next generation.
 
If it is competitive gaming, FSR will in theory be around 3x faster than DLSS, so FSR will add about 1 ms to the frame time while DLSS will add about 3 ms. If the lower-res render time plus the upscaler's frame time is less than the native-res frame time, your latency will improve; if it is greater than the native-res frame time, your latency will increase.
But this is in theory, since FSR is new and has only been added to Dota 2 so far. Due to its simpler nature it will of course take less time than DLSS, which is why I said no one will use DLSS in competitive games anymore once they add FSR.
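As a rough sketch of that rule of thumb, the check is simply whether the low-res render time plus the upscaler's own cost comes in under the native frame time. The 1 ms / 3 ms figures are the estimates above, and the render times are made-up assumptions picked so both branches of the check show up:

```python
# Rule of thumb from the post above: upscaling only helps latency when
# low-res render time + upscaler cost < native render time.

def latency_improves(low_res_ms: float, upscaler_ms: float, native_ms: float) -> bool:
    return (low_res_ms + upscaler_ms) < native_ms

native_ms = 8.0    # assumed: ~125 fps at native res
low_res_ms = 6.0   # assumed: render time at the reduced internal res

for name, cost_ms in [("FSR", 1.0), ("DLSS", 3.0)]:
    total_ms = low_res_ms + cost_ms
    verdict = "improves" if latency_improves(low_res_ms, cost_ms, native_ms) else "gets worse"
    print(f"{name}: {total_ms:.1f} ms vs {native_ms:.1f} ms native -> latency {verdict}")
```

With these assumed numbers the FSR path lands under the native frame time while the DLSS path doesn't; change the inputs and the verdict flips, which is all the inequality is saying.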
 
For Warzone, from what I have seen, the 6900 XT is the better choice, and as for the new Battlefield, who knows; it probably doesn't matter :) Personally I think they are both overpriced compared to a 3080/6800 XT. I guess it depends what you are willing to pay and whether you want to wait or not.
 
If it is competitive gaming, FSR will in theory be around 3x faster than DLSS, so FSR will add about 1 ms to the frame time while DLSS will add about 3 ms. If the lower-res render time plus the upscaler's frame time is less than the native-res frame time, your latency will improve; if it is greater than the native-res frame time, your latency will increase.
But this is in theory, since FSR is new and has only been added to Dota 2 so far. Due to its simpler nature it will of course take less time than DLSS, which is why I said no one will use DLSS in competitive games anymore once they add FSR.

No competitive gamer should be using FSR or DLSS. I know I won't, nor will I use ray tracing, because it's stupid outside of relaxing single-player games. I don't use DLSS, FSR or RT in BFV, Warzone or Cold War, and I won't use them in BF2042 or the next CoD, because they're stupid in these games.
 
No competitive gamer should be using FSR or DLSS. I know I won't, nor will I use ray tracing, because it's stupid outside of relaxing single-player games. I don't use DLSS, FSR or RT in BFV, Warzone or Cold War, and I won't use them in BF2042 or the next CoD, because they're stupid in these games.
This. I would also disable FSR if it was supported in COD.

My typical competitive settings in COD are everything on lowest, but I have Texture Quality and model quality on Ultra, with the 4K Texture pack installed. These don't have much impact on FPS, but they do improve the quality of the image.
 
This. I would also disable FSR if it was supported in COD.

My typical competitive settings in COD are everything on lowest, but I have Texture Quality and model quality on Ultra, with the 4K Texture pack installed. These don't have much impact on FPS, but they do improve the quality of the image.
nooo!! you gotta max out them visuals, use them rt effects to their max while playing an all out 64 vs 64 battle!! you have to use dlss in tow too!! this is an eye candy game where you're fighting against mindless bots so there's no problem getting 80-100 fps!! who needs locked 144+ fps on a competive conquest shooter amirite?
 
nooo!! you gotta max out them visuals, use them rt effects to their max while playing an all out 64 vs 64 battle!! you have to use dlss in tow too!! this is an eye candy game where you're fighting against mindless bots so there's no problem getting 80-100 fps!! who needs locked 144+ fps on a competive conquest shooter amirite?
Haha stop it, there's enough crazy on display in this thread already. :p

Gonna test that same level today and put some footage up. Curious to see if I can introduce a bottleneck by running 1440P Low settings with half the cores disabled, so essentially a 5800X.
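In case it's useful, one hedged way to approximate "half the cores disabled" without a BIOS change is to pin the game process to half of the logical CPUs with psutil. This is only a sketch under assumptions: the executable name below is a placeholder, and affinity pinning isn't identical to physically disabling cores since cache and SMT topology differ.

```python
# Sketch only: pin a running game to the first half of the logical CPUs.
# Needs `pip install psutil` and usually an elevated/admin prompt.
import psutil

GAME_EXE = "BlackOpsColdWar.exe"  # placeholder process name, adjust to the real one

half = psutil.cpu_count(logical=True) // 2
allowed = list(range(half))       # first half of the logical CPUs

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(allowed)  # restrict the game to those CPUs
        print(f"pinned {GAME_EXE} (pid {proc.pid}) to CPUs {allowed}")
```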
 
nooo!! you gotta max out them visuals, use them rt effects to their max while playing an all out 64 vs 64 battle!! you have to use dlss in tow too!! this is an eye candy game where you're fighting against mindless bots so there's no problem getting 80-100 fps!! who needs locked 144+ fps on a competive conquest shooter amirite?

Playing with RT on for a couple of minutes then turning it off to never use it again sounds like a good plan :D
 
Haha stop it, there's enough crazy on display in this thread already. :p

Gonna test that same level today and put some footage up. Curious to see if I can introduce a bottleneck by running 1440P Low settings with half the cores disabled, so essentially a 5800X.
I saw the footage you shared, I think that was TDM? TDM has much higher frames than Warzone; it is lighter on the CPU side, just an FYI. For comparison, I was getting 120-140 frames in Warzone and 180-250 frames in TDM. The bigger the map and the higher the player count, the harder the game hits the CPU (at this point, the game is really badly optimised).

"
your cpu is the bottle neck

my 9900 ks specia lediton is @ 5.2 ghz / 4.9 ring with 32 gb of 4133 mhz tuned tight ram and i can get over 155 - 175 in down town with 1440p ultra , low shadows , ray tracing off"

https://www.reddit.com/r/CODWarzone/comments/o2ycv7/warzone_needs_to_fix_the_cpu_usage_for_zen_2/

I mean, look at this dude. He's proud of getting 155-175 fps with a 5.2 GHz all-core OC + 4133 tuned RAM. 90% of these user reports come from Nvidia GPU users (which also has something to do with Nvidia having a bigger market share), but pretty much, Zen 2 is EOL for this game if you want to even lock to 120 fps with an Nvidia GPU; AMD GPU users seem to be affected less. You practically need a Zen 3 chip or an amazingly well-overclocked Intel chip to get a consistent 144+ fps in Warzone.

It's not silly to assume BF2042 will hit CPUs harder than Warzone, judging by the trailer, and judging by how BFV was one of the most CPU-demanding games when it was released...
 
I saw the footage you shared, I think that was TDM? TDM has much higher frames than Warzone; it is lighter on the CPU side, just an FYI.
Yeah, agree with you on Warzone. However, the videos I posted and compared were both running Black Ops. Same game, but yeah, not quite as heavy on the CPU as Warzone.

Speaking of Warzone, I ran a stock 6900 XT at 1440P on that when I first got my MBA 6900 XT. GPU utilisation locked at 96-97%, same as Black Ops, despite all those players. 6900 XT hardware scheduler doing its thing. :)

Agree. I would expect 64v64 BF2042 to be very heavy on the CPU with all those players and the CPU calculations required.
 