
3080 Ti or 6900 XT for Battlefield 2042 and Warzone?

Associate · Joined 17 Dec 2006 · Posts: 9
Which card would be better for Battlefield 2042 and Warzone out of the cards above? I have an
Alienware AW3418DW 34-inch WQHD (3440x1440)
and an
AORUS FI27Q-X 1440p 240 Hz monitor.
I've had Nvidia cards for at least the last 13 years, and I can see performance in most games seems to be better on the 6900 XT, but are there any problems with AMD drivers or other problems with these AMD cards, or is it all good now?
 
So out of these brands and models, is one better than the others? Apart from the boost clock speed, which is not going to make enough of a difference between those three, is one brand better than another?
Gigabyte Radeon RX 6900XT 16GB AORUS MASTER
PowerColor Radeon RX 6900XT Red Devil
MSI Radeon RX 6900 XT GAMING Z TRIO

 
Watch this from start to finish; it should help you decide.

I'd go with the MSI personally, but they are all good.

EDIT

If you are using a 6700K, you will need to upgrade to a Ryzen 5000-series CPU to unlock the full potential of the 6900 XT. That much GPU horsepower needs a CPU with high IPC.
 
6900XT

Battlefield 2042 will be largely CPU-bound, and most Zen 3 CPUs will struggle to push a locked 120 FPS with Nvidia GPUs due to ~25% extra CPU overhead.

The 6900 XT will probably lock to 144 FPS with a 5800X while a 3080 Ti will not, often dropping to 100-110 FPS CPU-bound due to software scheduling overhead.

Mark my words. This will happen.
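A back-of-envelope sketch of that claim (my own illustrative numbers, not measurements): if the driver adds a fixed percentage of extra CPU time to every frame, the FPS ceiling in a CPU-bound game drops by the same factor.

```python
# Rough sketch: how extra per-frame CPU time from driver overhead
# caps the frame rate when the CPU is the bottleneck.

def cpu_bound_fps(base_cpu_ms: float, overhead_frac: float) -> float:
    """FPS ceiling when CPU frame time, inflated by driver overhead,
    is the bottleneck."""
    return 1000.0 / (base_cpu_ms * (1.0 + overhead_frac))

# Assume the CPU needs ~6.94 ms of work per frame (just enough for 144 FPS).
base_ms = 1000.0 / 144.0

print(round(cpu_bound_fps(base_ms, 0.00)))  # no overhead   -> 144
print(round(cpu_bound_fps(base_ms, 0.25)))  # +25% overhead -> 115
```

So a ~25% per-frame overhead would turn a 144 FPS CPU ceiling into roughly 115 FPS, which is in the same ballpark as the 100-110 FPS figure above.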
 


Are there any games where this happens with a 5800x right now?
 
OP is using a 6700K according to his sig, looking at bottleneck city.
Did some digging. To give you an idea how much faster a 6900 XT is vs a 3080 Ti in Call of Duty: Black Ops Cold War at 1440p Ultra settings, have a look at the two videos below.

They are almost identical in terms of settings; the 3080 Ti has motion blur on but the 6900 XT has a much wider FOV, so it looks like a decent comparison. Both GPUs are overclocked.

6900 XT

3080 TI

All the while drawing 100W less. :D
 

I don't have any data on the 5800X, but it happens on a 5600X, and I'm fairly sure it would happen on a 5800X in Warzone. I wish they did more comprehensive tests, but Hardware Unboxed are probably afraid of Nvidia embargoing them even further, so they're wary of investigating the overhead problem on higher-end CPUs. Nvidia would've been exposed more, and it would apparently create problems for them (or push them to do actual software work, since they're capable). 20-25% CPU overhead is simply too much; it can mean 3-4 generations of CPUs.

Honestly, a 20-25% CPU power advantage is nothing to scoff at. For high-refresh-rate gaming, Nvidia is "doomed" in my eyes unless they fix their scheduler. They get by for now because current games are designed to hit a rock-solid 144 FPS with 2-3-year-old CPUs (not in the case of Warzone).

See the video:

https://youtu.be/IpPQKK0X_Is?t=57

The 5800X gets CPU-bound near 150-180 FPS. That's a dangerous number right there. BF 2042 will be much, much tougher on the CPU with its 64v64 battles; it will be nowhere near Warzone in terms of CPU load. Remember how BF V redefined how people looked at CPUs.

That 20-25% of extra CPU headroom (from avoiding Nvidia's overhead) will be much appreciated by gamers once next-gen games start to hammer CPUs even further.
 
If you look at the videos above, I do see a few instances where the 3080 Ti's GPU utilisation drops low, and according to the data presented via the OSD, the 0.1% lows look bad.

The 6900 XT is locked at 96-97% GPU utilisation and does not drop below that.

Perhaps those could be the CPU problems you were looking for on a 5800X?

Either way though, that won't account for the large performance difference (avg FPS) between the GPUs.
 
Yeah, but we need actual videos where they do side-by-sides in the same location with similar CPU-bound competitive (low) settings.

Sadly, I couldn't find any

I will keep looking and post it here if I find any

You do have a point though: 1% lows are worse when the CPU is at its limit, and Nvidia's overhead definitely doesn't help.

In most AAA games I'm certain the GPU will be the bottleneck, but in games like Warzone and BF 2042 (in future) people like to play with CPU-bound settings, so that's highly problematic.
 
I could record the same level, with half my cores disabled. I've done it before and the FPS does not change.

EDIT - Actually, I forgot I recorded a 12v12 game with half the cores disabled; it was running at 4K max settings though, not 1440p.

Side by side? I don't own a 3080 Ti, would never buy one, and even if I wanted to, I'd have more chance of winning the EuroMillions. :D

Nonetheless, I believe it shows the behaviour you spoke of.
 
A 3080 Ti is pretty much a 3090.



Don't forget with COD, you get DLSS too.

Also, it depends: if you want ray tracing on, then the 3080 Ti would without a doubt be better, although I would go for a 3090 over the 3080 Ti, or save your cash and get a 3080 FE. It all depends on what price you can get the GPUs for.


It's a bit silly trying to guess what BF 2042 will require given it's not out, and it also looks to be potentially sponsored by Nvidia, thus it:

- will have DLSS and possibly no FSR
- will probably be heavy on ray-traced effects, and if so AMD will probably tank in performance, as they do in Cold War with ray tracing
 
DLSS and ray tracing are not used by anyone with a modicum of intelligence in competitive shooters, mate, due to the latency issues they add.

DLSS adds blur (worse visibility) and DLSS adds latency, per reports from end users in Warzone using DLSS. And RT increases latency due to the FPS hit. Do your research. :p
 

Not tried it with COD, but no latency issues in BF V with DLSS and ray tracing on here (144 Hz monitor).

Got some articles/videos to show the supposedly increased latency?

As per Battle(non)sense (an expert on latency/input lag on YouTube), the only way there would be "increased latency" with ray tracing on is that the frame rate is lower and thus latency is higher, and this is why DLSS at 100+ FPS would be better than no DLSS at 60 FPS.
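To put that frame-rate/latency point in rough numbers (a trivial sketch, nothing measured): each rendered frame contributes about one frame time to the end-to-end latency, so higher FPS, even if achieved via DLSS, means a lower latency floor.

```python
# Sketch: render latency scales with frame time, so higher FPS
# (even via upscaling) lowers the latency floor.

def frame_time_ms(fps: float) -> float:
    """One frame's worth of latency in milliseconds."""
    return 1000.0 / fps

print(round(frame_time_ms(60), 1))   # 60 FPS  -> 16.7 ms per frame
print(round(frame_time_ms(100), 1))  # 100 FPS -> 10.0 ms per frame
```

So going from 60 FPS to 100+ FPS saves ~7 ms of frame time per frame, which is why the lower-latency option is usually whatever gets the frame rate up.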
 
No, just user reports of added latency and blurriness. If you do your research you will find it.
 

i.e. another case of people taking some random user's error with no actual substance to back up the claim ;) :cry:

Had a quick google but can't see anything myself. Got links?

Seems like another case of people getting motion blur/ghosting mixed up with input lag... The ghosting has nothing to do with input lag/latency. Have a read about pixel response time and input lag (although the ghosting caused by DLSS is nothing to do with the display/panel either). TFTCentral are very good.

PS. User reports seem fine from people who have actually tried it:

https://www.reddit.com/r/nvidia/comments/jtfvw7/dlss_on_cod_cold_war_in_amazing/

And with 2.2 dlss, the ghosting issues are reduced:

https://www.dsogaming.com/news/rain...n-be-used-in-older-games-to-reduce-artifacts/

Is this something you've seen on your AMD subreddit by any chance?
 