Consoles don't run Ultra High Preset.
Ultra settings are just for epeen anyway. Most of the time the visual difference is only noticeable by analysing screen shots.
+1
Zen 3 will be in stores on 5 November.
I always enjoy those Digital Foundry videos: "If you zoom in 10x you can see, at ultra settings, that the edge of this shadow is a lot more diffuse than at lower settings." Brilliant, that shadow that I ran past at top speed to avoid a hail of bullets?
Series X runs Valhalla at 4K 30fps - settings have not been published, but that is what the game runs at.
Not sure where you've gotten that from, it's 4K/60 on Series X. A quick Google search will confirm.
Sounds about right anyway - the recommended PC spec for 4K 30fps is an RTX 2080, and for 4K 60fps it's an RTX 3080.
Lots of graphics horsepower at low power draw
https://twitter.com/architectu2/status/1317227778501201922?s=19
https://twitter.com/xboxseriesnews1/status/1316749680592781312?s=19
128 watts in game!? What?!
That seems crazy low to me... granted it was playing RDR2, but I'd expect that despite being "last gen" that game should be stressing the console pretty well, unless it's locked to the old framerates?
The tweet is referencing figures measured here btw https://tweakers.net/reviews/8252/3/xbox-series-x-onze-eerste-indrukken-en-tests-meten-is-weten.html
it says 350w mate above your post
what do you mean?
I'd prefer ultra settings in a game I've paid for, so for me it's a major thing. I've been playing Cold War at 1440p and the difference between that and 4K is pronounced, but so is the performance dip, with terrible memory leakage imho on the 5700 XT at the minute. It's a beta, but I want the best visuals, being as a PC costs thousands of pounds.
Hardware Unboxed picked up on this when Digital Foundry put out a review of the 3080 before the official NDA lift; Doom Eternal was also featured in Nvidia's performance slides.
It led people to think the 3080 was going to be twice as fast as the 2080, and Jensen made similar claims in his kitchen that day. A lot of people on the internet used it as a performance comparison against the 2080 and 2080 Ti.
Long story short, Hardware Unboxed reiterated those findings in their RTX 3080 review.
That is, with the settings Digital Foundry used, Doom Eternal would use 9GB of VRAM. The RTX 2080 only has 8GB, so the 2080 chokes on the lack of VRAM, and its performance compared with the 3080 is a lot worse than it is with a reduced texture quality setting that uses only 7GB of VRAM.
4K Ultra Nightmare (9GB):
2080: 88 FPS (100%)
2080 Ti: 140 FPS (159%)
3080: 189 FPS (215%)

4K Ultra Nightmare, Ultra Textures (7GB):
2080: 111 FPS (100%)
2080 Ti: 140 FPS (126%)
3080: 189 FPS (170%)
To be clear, with the reduced texture quality setting the 2080 Ti's and 3080's results don't change; the only card whose result changes is the 2080. The 11GB 2080 Ti goes from 59% faster than the 2080 to 26% faster, and the 10GB 3080 goes from 115% faster to 70% faster.
https://youtu.be/csSmiaR3RVE?t=746
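If anyone wants to sanity-check those percentages, here's a quick sketch in Python. The FPS figures, VRAM capacities and per-setting VRAM usage are just the numbers quoted above, hard-coded (nothing is pulled from the video itself); it reproduces the relative-performance figures and flags which card runs out of VRAM at each setting:

```python
# Numbers quoted above in this thread - VRAM capacities and per-setting
# usage are taken from the post, not measured.
cards_vram_gb = {"2080": 8, "2080 Ti": 11, "3080": 10}

results = {
    "4K Ultra Nightmare (9GB)": {
        "vram_used_gb": 9,
        "fps": {"2080": 88, "2080 Ti": 140, "3080": 189},
    },
    "4K Ultra Nightmare, Ultra Textures (7GB)": {
        "vram_used_gb": 7,
        "fps": {"2080": 111, "2080 Ti": 140, "3080": 189},
    },
}

for setting, data in results.items():
    baseline = data["fps"]["2080"]  # percentages are relative to the 2080
    print(setting)
    for card, fps in data["fps"].items():
        # Flag cards whose frame buffer is smaller than what the setting needs.
        note = " <- exceeds VRAM budget" if data["vram_used_gb"] > cards_vram_gb[card] else ""
        print(f"  {card}: {fps} FPS ({fps / baseline:.0%} of 2080){note}")
    print()
```

It prints the same 159%/215% and 126%/170% figures as above, and only the 2080 trips the VRAM check at the 9GB setting, which is exactly the point being made.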
8GB is not enough to run Doom Eternal on Ultra settings; this is how Digital Foundry got Nvidia's claimed 2x performance over the 2080 - the 2080's 8GB buffer was oversaturated, which dragged its performance down.
See this video.
https://www.youtube.com/watch?v=csSmiaR3RVE&feature=youtu.be&t=746
Yep, when I go 4K on Cold War it just stutters every 10 seconds, drives me insane, and I had to drop the res to 1440 - so annoying really. But yeah, agreed, 8GB is just not enough YET. In BF5 it runs incredibly well at 4K at 90+ fps... really weird... However, I think anything less than 11GB for absolute 4K gaming just isn't going to work, so that for me ruled out the 3080... hence why I'll wait for the big N, as it'll have all the memory I'll ever need for 4K and below PLUS the power to use it. Win win... bring it on.
Digital Foundry, their content is not to be trusted.