AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Ultra settings are just for epeen anyway. Most of the time the visual difference is only noticeable by analysing screen shots.

I always enjoy those Digital Foundry videos: "If you zoom in 10x you can see that at ultra settings the edge of this shadow is a lot more diffuse than at lower settings." Brilliant, that shadow that I ran past at top speed to avoid a hail of bullets?
 

Correct. There aren't many games, apart from exploring open worlds, where you look around at a slow pace. Even in GoW and other beautiful engines you have to fight NPCs or solve puzzles etc., so you're taking in some of the quality, just with no need for ultra. If the GPU can handle ultra and it's still way above 60fps then by all means run it at full crank.
 
Series X runs Valhalla at 4K 30fps. Settings have not been published, but that is what the game runs at.

Sounds about right anyway; the recommended PC spec for 4K 30fps is an RTX 2080 and the recommended spec for 4K 60fps is an RTX 3080.
Not sure where you've gotten that from, it's 4k/60 on series X. A quick Google search will confirm.

Will be interesting to see what GPU will be needed to match that performance and image quality.
 
@Hedge Depends on the game, for example I find there's a very notable difference in MSFS2020 between high and ultra, especially when it comes to draw distance.

Fast paced games yeah, definitely not so much.
 
it says 350w mate above your post

Yes mate, and the article linked with actual measurements says 128w while gaming which is what I was discussing. The cap Humbug took was from someone replying to the tweet with the 128w figure, I provided the actual article.

It seems ridiculously low, but it was what they actually measured with a Series X in hand, vs some random tweeter saying 300/350W. I was wondering if the game might have been frame capped, being last gen, because it just seems incredibly low.

Look at the article I linked (just translate from German).
 
what do you mean?

it says 350w mate above your post

lol, can't tell if serious :D

I don't know why people keep speculating about the Series X power consumption; it's been known since long before Digital Foundry published their quite obviously nonsense content on it.

It has a 200 Watt TDP; 180 Watts falls right in line with that.

https://www.techpowerup.com/gpu-specs/xbox-series-x-gpu.c3482

Who do you believe? Microsoft and TPU or Nvidia paid content?
 
I'd prefer ultra settings in a game I've paid for, so for me it's a major thing. I've been playing Cold War at 1440p and the difference between that and 4K is pronounced, but so is the performance dip, with terrible memory leakage imho on the 5700 XT at the minute. It's a beta, but I want the best visuals, being as a PC costs thousands of pounds.
 
128 watts in game!? What?!

That seems crazy low to me... granted it was playing RDR2, but I expect despite being "last gen" that game should be stressing the console pretty well unless it's locked to the old framerates?

The tweet is referencing figures measured here btw https://tweakers.net/reviews/8252/3/xbox-series-x-onze-eerste-indrukken-en-tests-meten-is-weten.html

On Series X it will run at 4K/30 unless a specific Series X optimisation gets released. So it wouldn't have been stressing the Series X that much; it would be like running RDR2 on a PC with a 3700X and 2080S at console settings capped at 30fps. I bet the GPU wouldn't be running much over 50%, and the CPU at 20% or so.
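To illustrate the reasoning, here is a minimal sketch; the ~60fps uncapped figure is purely an assumption for illustration, not a measurement:

```python
# Back-of-the-envelope estimate of GPU utilisation under a frame cap.
# ASSUMPTION (illustrative only): the card could render ~60 fps uncapped
# at those settings, so a 30 fps cap leaves it idle roughly half the time.

def estimated_utilisation(capped_fps: float, uncapped_fps: float) -> float:
    """Rough fraction of GPU capacity used when the game is frame capped."""
    return min(capped_fps / uncapped_fps, 1.0)

print(f"{estimated_utilisation(30, 60):.0%}")  # -> 50%
```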
 

8GB is not enough to run Doom Eternal on Ultra settings. This is how Digital Foundry got Nvidia's claimed 2X the performance over the 2080: the 2080's 8GB buffer was oversaturated, which dragged down its performance.

See this video.

https://www.youtube.com/watch?v=csSmiaR3RVE&feature=youtu.be&t=746

Digital Foundry, their content is not to be trusted.

Hardware Unboxed picked up on this when Digital Foundry made a review of the 3080 before the official NDA lift; Doom Eternal was also featured in Nvidia's performance slides.

It did lead to people thinking the 3080 was going to be twice as fast as the 2080; Jensen made similar claims in his kitchen that day. A lot of people on the internet used it as a performance measure relative to the 2080 and 2080 Ti.

Long story short, Hardware Unboxed reiterated these findings in their RTX 3080 review.

That is, with the settings Digital Foundry used, Doom Eternal would use 9GB of VRAM. The RTX 2080 only has 8GB, so it chokes on that lack of VRAM and its performance falls much further behind the 3080 than it does with the reduced texture quality setting that uses only 7GB of VRAM.

4K Ultra Nightmare (9GB)
2080: 88 FPS (100%)
2080 Ti: 140 FPS (159%)
3080: 189 FPS (215%)

4K Ultra Nightmare, Ultra Textures (7GB)
2080: 111 FPS (100%)
2080 Ti: 140 FPS (126%)
3080: 189 FPS (170%)

To be clear, with the reduced texture quality setting the 2080 Ti and 3080 deliver exactly the same frame rates as before; the only change is the 2080. The 11GB 2080 Ti went from 59% faster than the 2080 to 26% faster, and the 10GB 3080 went from 115% faster to 70% faster.
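The percentages are simply each card's FPS relative to the 2080 in the same test; a quick sketch of the arithmetic using the figures quoted above:

```python
# Relative performance vs the RTX 2080, computed from the FPS figures quoted above.

results = {
    "4K Ultra Nightmare (9GB)": {"2080": 88, "2080 Ti": 140, "3080": 189},
    "4K Ultra Nightmare, Ultra Textures (7GB)": {"2080": 111, "2080 Ti": 140, "3080": 189},
}

for preset, fps in results.items():
    base = fps["2080"]
    summary = ", ".join(f"{gpu} {value / base:.0%}" for gpu, value in fps.items())
    print(f"{preset}: {summary}")

# 4K Ultra Nightmare (9GB): 2080 100%, 2080 Ti 159%, 3080 215%
# 4K Ultra Nightmare, Ultra Textures (7GB): 2080 100%, 2080 Ti 126%, 3080 170%
```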


https://youtu.be/csSmiaR3RVE?t=746

 
Yep, when I go 4K on Cold War it just stutters every 10 seconds, drives me insane, and I had to drop the res to 1440. So annoying really, but yeah, agreed, 8GB is just not enough. YET in BF5 it runs incredibly well at 4K at 90+fps... really weird... However yeah, I think anything less than 11GB for absolute 4K gaming isn't going to work, so that for me ruled out the 3080... hence why I'll wait for the big N, as it'll have all the memory I'll ever need for 4K and below PLUS the power to use it. Win win... bring it on.
 