"Any evidence of that?"

Horizon Zero Dawn (it was posted earlier in this thread) and Call of Duty Black Ops are two recent games that I know of.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
"I'm playing HZD at the moment so I'll take a look at that this evening."

I forget who posted it, but they referenced a reddit thread from a 5700 XT user and noted that the game was changing his texture rate, as shown by different screen captures, regardless of what was set in the video options.
One thing I did notice last night was that with the FPS capped for G-Sync and everything maxed out at 4K, my lowly 3090 FE only got up to 230W even with a hefty OC, with a max temp of 60C.
Thought I’d read somewhere that this game was taxing on systems but it’s clearly not.
Not sure it could manage 120Hz but I might try it. Only played the Doom games at 120Hz so far.
"Thing is, until it is proven either by a trusted source or multiple times, anyone can say x or y on reddit etc, which does not make it so."

It will be a long time before review sites start testing for it, I suspect. It'll just be a case of checking the dedicated/allocated usage and calling it a day. It is what it is, but that testing methodology is not sufficient anymore.
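For context on what "checking the dedicated/allocated usage" means in practice, here is a minimal sketch of that kind of polling, assuming an NVIDIA card with nvidia-smi on the PATH (the 5-second interval is arbitrary). It only reports what the game has allocated, which is exactly why that methodology can miss silent texture downgrades:

```python
# Minimal sketch: poll VRAM allocation the way most reviews do.
# Assumes nvidia-smi is on the PATH; the 5-second interval is arbitrary.
# This only shows allocation - it cannot tell you whether the engine is
# quietly lowering texture quality to fit inside the available memory.
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only

if __name__ == "__main__":
    while True:
        print(f"VRAM allocated: {vram_used_mib()} MiB")
        time.sleep(5)
```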
@LtMatt do you agree that DLSS looks better than native?
"It will be a long time before review sites start testing for it, I suspect. It'll just be a case of checking the dedicated/allocated usage and calling it a day. It is what it is."

Yeah, I agree it's better to have it. But when the prices are what they are right now, things change a little. Also, as I say, if you know for a fact that you will be selling and moving on to next gen rather than keeping it long term, it is even less of an issue.
If you've experienced the
The old saying goes though, better to have something and not need it, than need something and not have it. That certainly applies to video memory for me having come from an 8GB card.
I'll take native any day, but that's just me.
"I'll take native any day, but that's just me."

Agreed. I have been saying this from the start. Some did not like it.
A little off topic, but given recent posts from @Smiffy-UK, the GDDR6 used on the MBA 6800 XT and 6900 XT has a maximum temperature of 100C.
Once you hit this temperature the GPU will throttle core and memory speeds until temperatures drop, or until you alt-tab out to reapply a GPU Tuning profile.
The temperatures below are from a 3-hour session of Red Dead Redemption at Ultra settings in 4K, in a 23C room, with the GPU fan speeds completely silent at 1500RPM, which is the default 6900 XT fan speed.
The maximum temperatures are on the right and are peak values; typical in-game temperatures are a couple of degrees lower.
The 6900 XT memory junction temperature peaks at 94C, 6C away from the throttling temperature. The core junction temperature peaks at 104C, also 6C away from its throttling temperature.
The difference between edge and core junction temperature is 14C at peak, which indicates excellent contact between the heatsink, thermal pads and GPU die.
[Screenshot: temperature readings from the session]
People worry about temperatures, but as long as you're below the maximum temperature it will not affect the life span of the GPU. Voltage will kill your GPU faster than temperature will.
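To make the headroom figures above concrete, here is a small sketch of the arithmetic. The 100C memory junction limit is stated above; the 110C core junction limit is inferred from the "6C away" comment, and the 90C edge peak follows from the 14C edge-to-junction delta:

```python
# Throttle headroom from the peak readings quoted above (degrees C).
# Limits: 100C memory junction (stated), 110C core junction (inferred from
# the "6C away" comment). The 90C edge peak follows from the 14C delta.
limits = {"memory_junction": 100, "core_junction": 110}
peaks = {"memory_junction": 94, "core_junction": 104, "edge": 90}

for sensor, limit in limits.items():
    print(f"{sensor}: peak {peaks[sensor]}C, {limit - peaks[sensor]}C of headroom")

# A small edge-to-hotspot gap suggests good contact between die, pad and heatsink.
print(f"edge to core junction delta: {peaks['core_junction'] - peaks['edge']}C")
```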
"Is your memory overclocked? And are you using stock or a high power limit?"

2400MHz core clock, 2112MHz memory, 0% power limit. No increased power limits; these are 24/7 gaming clocks. I favour a low RPM fan speed over low temperatures. My 420mm AIO runs at a peak of 1000RPM.
"Can you try to increase the power limit and check again? I'm curious if the memory temps will move into throttling range."

No, the core junction is the first to throttle if I increase voltage and don't increase fan speed to keep things in check. I've never seen the memory junction go past 95C, AFAIK.
Any evidence of that?
Horizon Zero Dawn (it was posted earlier in this thread) and Call of Duty Black Ops are two recent games that I know of.
"This post"

That was it, good find. That is how you test it: you need two or more GPUs with varying sizes of video memory.
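For anyone wanting to reproduce that kind of comparison, here is a rough sketch of diffing captures of the same scene taken at identical settings on two cards with different VRAM sizes. It assumes Pillow is installed, and the file names are purely illustrative:

```python
# Rough sketch: diff two screenshots of the same scene, captured at identical
# settings on cards with different VRAM sizes, to spot silently downgraded textures.
# Requires Pillow (pip install Pillow); the file names below are just examples.
from PIL import Image, ImageChops

a = Image.open("scene_16gb_card.png").convert("RGB")
b = Image.open("scene_8gb_card.png").convert("RGB")

diff = ImageChops.difference(a, b)
bbox = diff.getbbox()  # None means the captures are pixel-identical

if bbox is None:
    print("No difference - both cards rendered identical frames.")
else:
    # Crude measure: average per-channel difference across the whole frame.
    total = sum(sum(px) for px in diff.getdata())
    avg = total / (diff.width * diff.height * 3)
    print(f"Differences in region {bbox}, average channel delta {avg:.2f}")
    diff.save("texture_diff.png")  # inspect where the frames differ
```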
"This post"

Woah, did not expect that.
It will get worse over time, not better. As new games developed for the new consoles are released, the VRAM required will inevitably increase. The 3080 is a ticking time bomb at 4K.
It doesn't look like that on a 3080, though, so it would need more testing than just some randomer on reddit. Does it happen with other 8GB cards like the 2070S, 3070 etc, or is it a 5700 XT issue?
"We all know deep down also though that a 3000 series card with 16+ gig VRAM and the horsepower of the 3080 is going to be nowhere near £650, so personally I would rather have my 10 gig 3080 at this price than a higher VRAM option which would likely be out of my budget."

Exactly this. The performance is more important than the VRAM right now. By next year we will be rocking next-gen cards with more VRAM anyway, so I am well happy they went with 10GB to give us all a lower cost option.
"That was it, good find. That is how you test it: you need two or more GPUs with varying sizes of video memory."

That just seems like confirmation bias. For all we know that could be 4K8K. You do remember him, don't you? He kept going on about AMD having much better image quality than nvidia. Then when we dug deeper it turned out he was using a decade-old or older laptop with an nvidia card as a comparison. Lol.
This post
"That just seems like confirmation bias. For all we know that could be 4K8K. You do remember him, don't you? He kept going on about AMD having much better image quality than nvidia. Then when we dug deeper it turned out he was using a decade-old or older laptop with an nvidia card as a comparison. Lol."

I don't believe that user is a member here, and his postings seem sensible to me, so I don't think his findings should be ignored just because they go against a narrative.
Until we get many people here to confirm it, or a reputable website/YouTuber, it is as credible as 4K8K's scientific paper.
Funnily enough, is he not rocking a 3080 these days? Haha.