
10GB VRAM enough for the 3080? Discuss...

I play Halo Infinite on a 3840x1080 screen (basically a little more than 1440p in pixel count) and noticed it shows 11GB of "usage" at ultra settings on my 6800XT. When I saw that I got a chuckle, because I'm sure AMD had some influence there. Nvidia did it to AMD with Crysis 2 and its unnecessary tessellation, and now it seems AMD is returning the favor with VRAM in the titles they're involved with.

That's another thing about VRAM: it seems to be rather easy to blow out a buffer of *any* size if a developer chooses to.
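Worth noting that the "usage" figure overlays show is what NVML calls used memory, i.e. what has been allocated on the card, not what the game actually touches each frame. Here's a minimal sketch of reading that same counter yourself (purely illustrative; it assumes an Nvidia card and the nvidia-ml-py package, and AMD exposes similar counters through its own tooling):

[CODE]
# Read the same "used" VRAM counter most overlays display.
# NVML reports memory *allocated* on the device, not what the game
# actively needs per frame, so a high reading alone doesn't prove
# that much VRAM is required.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Total VRAM: {info.total / 2**30:.1f} GiB")
print(f"Allocated:  {info.used / 2**30:.1f} GiB")  # allocation, not active use
pynvml.nvmlShutdown()
[/CODE]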
 
But weren't people holding TPU/wizard's comments in high regard because he said something about VRAM issues? Now, all of a sudden, it's "clearly never played the game". Goalposts moved right there, ladies and gentlemen :cry: Just like the Fury X comments about "4GB being enough for 4K" :D

+25%? It's a 2070 with mustard up its bottom. What did you have before, and how much did you pay for your overexcited 2070? @TNA

And yet you come to the conclusion that "it's basically the same". Do you not see the flaw there? Also, depending on resolution and game, it is more than 25% at times...

Like I said, coming from a 2070S for a potential outlay of £450+? Is it worth it? Nope. Had you sold on your 2070S to bring the 3070 down to, say, under £200, then depending on one's needs/wants, it "might" be worth it.


And if you read through the thread, you will see it was debunked. In fact, HUB did two videos after they got called out for their results showing Nvidia being better than AMD; it turns out the testing scene they used had more action/enemies, whereas other sites tested areas with nothing happening :cry:
 
I'd forgotten about Halo Infinite.

@tommybhoy Did you ever try out Resident Evil 2 with the RT patch that was added?
I play Halo Infinite on a 3840x1080 screen (basically a little more than 1440p in pixel count) and noticed it shows 11GB of "usage" at ultra settings on my 6800XT. When I saw that I got a chuckle, because I'm sure AMD had some influence there. Nvidia did it to AMD with Crysis 2 and its unnecessary tessellation, and now it seems AMD is returning the favor with VRAM in the titles they're involved with.

That's another thing about VRAM: it seems to be rather easy to blow out a buffer of *any* size if a developer chooses to.
I'd be curious to see how the level shown in this video runs on your 3080, Twinz, with an overlay showing your frame times at 4K max settings and the HD Texture pack installed.
 
I'd forgotten about Halo Infinite.

@tommybhoy Did you ever try out Resident Evil 2 with the RT patch that was added?

I'd be curious to see how the level shown in this video runs on your 3080, Twinz, with an overlay showing your frame times at 4K max settings and the HD Texture pack installed.
He did. See here, where apparently the 3080 "***** itself", yet it seems to hold up well on my end...


Would be good to see your footage though, tommy, and I can capture the same area on my end.
 
I play Halo Infinite on a 3840x1080 screen (basically a little more than 1440p in pixel count) and noticed it shows 11GB of "usage" at ultra settings on my 6800XT. When I saw that I got a chuckle, because I'm sure AMD had some influence there. Nvidia did it to AMD with Crysis 2 and its unnecessary tessellation, and now it seems AMD is returning the favor with VRAM in the titles they're involved with.

That's another thing about VRAM: it seems to be rather easy to blow out a buffer of *any* size if a developer chooses to.

That's not how it works; it either needs those assets in the buffer or it doesn't.

Memory on GPUs is a hierarchy, just as it is on CPUs.

It starts with the fastest memory of all, the L1 cache. If data doesn't fit in there, it gets evicted to L2; if it doesn't fit in there, it gets evicted to the L3 cache. RDNA2 GPUs have this, they call it Infinity Cache, a level 3 cache on the GPU; in the case of your Ampere GPUs, the next level down is your GDDR6/GDDR6X memory. If it doesn't fit in there, it gets evicted to your system RAM, which is very much slower than your GDDR6 buffer, so lower performance. It can even cause frame stalls. Stutters, on your £700 GPU.
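To put rough numbers on what that spill costs, here's a toy model; every figure is plucked out of the air for illustration, and it assumes a game touches about 2GB of assets per frame:

[CODE]
# Toy model of VRAM spill-over: assets that don't fit in the GDDR6
# buffer get served from system RAM over PCIe instead. All numbers
# are invented purely for illustration.
VRAM_GB = 10          # e.g. a 3080's buffer
VRAM_BW_GBPS = 760    # rough GDDR6X bandwidth
PCIE_BW_GBPS = 32     # rough PCIe 4.0 x16 bandwidth

def asset_read_ms(working_set_gb, touched_per_frame_gb=2.0):
    # Fraction of the working set that has spilled out of VRAM.
    spill = max(0.0, working_set_gb - VRAM_GB) / working_set_gb
    in_vram = touched_per_frame_gb * (1 - spill)
    over_pcie = touched_per_frame_gb * spill
    return (in_vram / VRAM_BW_GBPS + over_pcie / PCIE_BW_GBPS) * 1000

for ws in (8, 10, 11, 12):
    print(f"{ws} GB working set -> {asset_read_ms(ws):.2f} ms of asset reads per frame")
[/CODE]

Even a small spill makes the slow PCIe path dominate the frame's memory time, which is why it shows up as stutter rather than just a lower average FPS.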

Consoles have more accessible memory than most of Nvidia's line-up. Halo Infinite is a console game, built for consoles and then ported to PC. It's nothing to do with AMD; your GPU just isn't as good as a console.
 
That's not how it works; it either needs those assets in the buffer or it doesn't.

Memory on GPUs is a hierarchy, just as it is on CPUs.

It starts with the fastest memory of all, the L1 cache. If data doesn't fit in there, it gets evicted to L2; if it doesn't fit in there, it gets evicted to the L3 cache. RDNA2 GPUs have this, they call it Infinity Cache, a level 3 cache on the GPU; in the case of your Ampere GPUs, the next level down is your GDDR6/GDDR6X memory. If it doesn't fit in there, it gets evicted to your system RAM, which is very much slower than your GDDR6 buffer, so lower performance. It can even cause frame stalls. Stutters, on your £700 GPU.

Consoles have more accessible memory than most of Nvidia's line-up. Halo Infinite is a console game, built for consoles and then ported to PC. It's nothing to do with AMD; your GPU just isn't as good as a console.

Thoughts on the HUB video above, humbug, where they look at every reviewer's results and testing scenarios, then redo their tests?
 
I'm sure it agrees with everything you say. It doesn't debunk what others have found in their testing; HUB are just testing it differently, perhaps not as thoroughly.

If you watch the video, you will see they looked at every major reviewer's piece on Halo and the scenes they tested, and most importantly it's all backed up in some shape or form. No matter what your opinion of HUB is, it is a good video.

I also did some of my own testing with ReBar both on and off (IIRC ReBar provided a 7fps boost overall) and didn't find any issues with "VRAM".

Also, can you explain why the 6800XT is showing some frame-time spikes in PCGamesHardware's benchmark review too (apparently they are the most reputable/trustworthy because of their FC 6 benchmark)?


Surely we shouldn't be seeing such frame spikes with all that VRAM and Infinity Cache in action?
 
I'm not sure if there is any issue with Halo Infinite at 4K max settings with the HD Texture pack, but that's why I'd like to see a long gameplay video (30+ minutes) with a frame-time graph overlay running, showing video memory allocation/usage and any variances in frame-time delivery. If you see spikes while allocation is sitting high at 10GB, that would point to an issue that FPS graphs won't detect; see the sketch below. I'm not saying there definitely is one, as I'm not aware of any, but having done some research into the game and user feedback from actual owners on external forums, I'm now curious to see it.
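To be concrete about the check I mean, something along these lines would do it; a rough sketch, where the log format and the 9.5GB threshold are just assumptions for illustration:

[CODE]
# Rough sketch: flag frame-time spikes that coincide with near-full
# VRAM allocation. The input format and thresholds are assumptions
# for illustration; a real capture (e.g. from MSI AB's benchmark
# logging) would need parsing into these two lists first.
from statistics import median

def flag_spikes(frametimes_ms, vram_alloc_gb, vram_threshold_gb=9.5):
    base = median(frametimes_ms)
    return [
        (i, ft, alloc)
        for i, (ft, alloc) in enumerate(zip(frametimes_ms, vram_alloc_gb))
        if ft > 3 * base and alloc >= vram_threshold_gb  # spike while nearly full
    ]

# Example: one big hitch while allocation sits at 9.8GB.
fts = [10.2, 10.5, 10.1, 48.0, 10.3]
vram = [9.1, 9.4, 9.7, 9.8, 9.6]
for i, ft, alloc in flag_spikes(fts, vram):
    print(f"frame {i}: {ft:.1f} ms at {alloc:.1f} GB allocated")
[/CODE]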
 
Also, the DF video on Halo is well worth watching:


They touch on stutter issues, as well as the settings compromises consoles make which high-end GPUs don't have to.
 
Yes, that's on the list. It runs awfully in certain scenes/CTDs.

With Halo it runs great indoors; it's when you go outside that the frame times tank.

Yet that is where "all" the reviewers tested in Halo too... Are we now just completely ignoring reviewers? :cry:

Can you upload footage of these scenes in RE 2 please? It's the first I have heard or seen of it. As mentioned, I know RDNA 2's fps plummets massively because of the RT, but I haven't seen a 3080's "fps tanking", as shown in my footage. I have read that apparently using HDR in conjunction with RT causes severe performance issues on all GPUs, but I had also read that HDR was "fake/poor" in RE 2, so I haven't bothered with it.
 
Yes, that's on the list. It runs awfully in certain scenes/CTDs.

With Halo it runs great indoors; it's when you go outside that the frame times tank.
Actually, it's important now to show the 1% lows when testing, as well as the typical current and average FPS.

All the YouTube videos I upload now include metrics like the 1% lows, because they will certainly show any issues with memory saturation or bad frame times.
That can all be set up using the MSI AB overlay; use the benchmarking feature to capture your frame times.
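For anyone unsure what the 1% low actually measures: on one common definition it's the average FPS over the slowest 1% of frames. A quick sketch with made-up frame times shows why a few stalls wreck it while the average barely moves:

[CODE]
# How 1% lows fall out of a frame-time log (one common definition:
# the average FPS over the slowest 1% of frames).
def fps_metrics(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000 * n / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]  # slowest 1%
    low_1pct = 1000 * len(worst) / sum(worst)
    return avg_fps, low_1pct

# 200 smooth 10ms frames plus two 60ms stutters (made-up numbers):
avg, low = fps_metrics([10.0] * 200 + [60.0, 60.0])
print(f"Average: {avg:.0f} fps, 1% low: {low:.0f} fps")  # ~95 fps vs ~17 fps
[/CODE]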
 
If you watch the video, you will see they looked at every major reviewer's piece on Halo and the scenes they tested, and most importantly it's all backed up in some shape or form. No matter what your opinion of HUB is, it is a good video.

I also did some of my own testing with ReBar both on and off (IIRC ReBar provided a 7fps boost overall) and didn't find any issues with "VRAM".

Also, can you explain why the 6800XT is showing some frame-time spikes in PCGamesHardware's benchmark review too (apparently they are the most reputable/trustworthy because of their FC 6 benchmark)?


Surely we shouldn't be seeing such frame spikes with all that VRAM and Infinity Cache in action?

ReBar just widens the CPU's access path to the GPU's buffer. For example (not a real representation), instead of the CPU's bandwidth to the GPU being 1GB/s, it's 10GB/s: rather than doing "draw 1GB in 1s" ten times over, it does "draw 10GB in 1s". It's a bottleneck reduction; it can do more in less time.

On a side note, something is fundamentally wrong with Intel's GPU-side thread scheduling. What AMD do with ACE units (physical hardware on the GPU), Nvidia do through the drivers (software thread scheduling); however Intel do it, it's ##### and that's why Intel have to have ReBar enabled.

Anyway, as I said, I'll watch it later :)
 