
10GB vram enough for the 3080? Discuss..

I don't believe that user is a member here, and his postings seem sensible to me, so I don't think his findings should be ignored just because they go against a narrative.

I definitely agree that it needs to be tested, but I very much doubt there is any appetite to do it. True VRAM requirement testing needs measurement of 0.1%/1% lows, and now looks to need image quality comparisons/pop-in tests comparing GPUs with differing memory amounts, as well as the obligatory dedicated vs. allocated VRAM usage. It's a lot of work.
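For anyone who does fancy having a go, the frame-time side is the most mechanical part. A minimal sketch, assuming a PresentMon/CapFrameX-style CSV with one frame time in milliseconds per row (the file name and column name below are just placeholders):

```python
import csv

def low_fps(frametimes_ms, percent):
    """Average FPS of the slowest `percent`% of frames (one common definition
    of the '1% low' / '0.1% low' figures reviewers quote)."""
    worst = sorted(frametimes_ms, reverse=True)        # slowest frames first
    n = max(1, int(len(worst) * percent / 100))        # e.g. the worst 1% of frames
    sample = worst[:n]
    return 1000.0 / (sum(sample) / len(sample))        # mean ms -> FPS

# "capture.csv" and "MsBetweenPresents" are assumptions - adjust to whatever
# your capture tool actually writes out.
with open("capture.csv", newline="") as f:
    times = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

print(f"avg FPS  : {1000.0 * len(times) / sum(times):.1f}")
print(f"1% low   : {low_fps(times, 1):.1f}")
print(f"0.1% low : {low_fps(times, 0.1):.1f}")
```

The image quality/pop-in side is the hard bit, as that still needs matched screenshots and a second card to compare against.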

I had an 8GB and a 16GB GPU in the 5700 XT and Radeon VII, but Horizon Zero Dawn/Black Ops were not released back then. What I can say is that the 5700 XT had worse 0.1%/1% lows and hitching in some games where VRAM usage was close to 8GB, whereas the Radeon VII didn't.

If you don't have two cards with different memory sizes, you can't really test it. I didn't think the 5700 XT had a problem until I tried a GPU with double the memory; then I began to see and measure differences. People can be blissfully ignorant sometimes.

It's easy to use one GPU and say there's no issue, and maybe there's not, but sometimes you don't notice these things until you try something different - that is my point, I guess.

At the time I did not know that games were coming out that would adjust image quality automatically regardless of what was set in the video options. COD BO even mentions this in the video options, I believe.

I wish I still had a 5700 XT as I'd love to do some comparisons against the Radeon VII or the 6900 XT, but I sold it and there's no chance of me getting another at a sensible price anytime soon.

I very much doubt this is just a problem with the 5700 XT, btw, as has been mentioned in this thread several times.

I'm not sure if @tommybhoy will want to jump in here as this was a private conversation offline, but he has mentioned that frame times on the 3070 can be all over the place once you get near to 8GB utilisation, which was similar to what I saw with the 5700 XT in a few games that used lots of video memory.
I am not saying he should be ignored. I am saying his findings should not be used as evidence or to make a point until proven :)

I mean, it is not like I am bothered if it turns out to be true. It's not like I plan to keep this card long term; it will very likely be sold sometime this year. I am all for new data like this being found, and if it's reproducible then we will have more information to base future purchasing decisions on.


Until someone else posts a video/image showing the exact same thing with a different 8GB GPU, you can't say for definite that it is because of the lesser VRAM though.

Looking at other comparisons, I can't see any difference, but then YouTube compression.....

Remember gregster posted a video comparing his AMD and Nvidia cards in games, and the way his Nvidia card had lesser IQ turned out to be down to his Nvidia drivers and/or settings and/or not restarting the game after applying the settings, or something.
+1
 
@TNA making accusations of confirmation bias when all your posts are pretty much "I'm happy with 10GB.....I love my 10GB card......I don't plan to keep my 10GB card for long.....I'll have upgraded my 10GB card......."

Tell everyone again how happy you are with your 10GB card :D
 
Until someone else posts a video/image showing the exact same thing with a different 8GB GPU, you can't say for definite that it is because of the lesser VRAM though.

Looking at other comparisons, I can't see any difference, but then YouTube compression.....

Remember gregster posted a video comparing his AMD and Nvidia cards in games, and the way his Nvidia card had lesser IQ turned out to be down to his Nvidia drivers and/or settings and/or not restarting the game after applying the settings, or something.
yeah I don't see much difference tbh
 
@TNA making accusations of confirmation bias when all your posts are pretty much "I'm happy with 10GB.....I love my 10GB card......I don't plan to keep my 10GB card for long.....I'll have upgraded my 10GB card......."

Tell everyone again how happy you are with your 10GB card :D
Hold on, what does me saying I will be selling my 3080 have to do with anything? I am very happy with the card. I even posted less than an hour ago in this very thread saying I am happy that Nvidia provided a 10GB option. Maybe you missed that post? ;)

I never keep cards for long. I upgrade every gen; hell, multiple times every gen. If I end up selling the 3080 by the end of the year as I said, I will have had it for over a year by then. That is pretty good, considering I had a Vega Nano, Vega Liquid, Vega 56 and Vega 64 over a period of a year or two.

You can choose to interpret what I am saying the way you want. But I will stick by my statements :D
 
Until someone else posts a video/image showing the exact same thing with a different 8GB GPU, you can't say for definite that it is because of the lesser VRAM though.

Remember gregster posted a video comparing his AMD and Nvidia cards in games, and the way his Nvidia card had lesser IQ turned out to be down to his Nvidia drivers and/or settings and/or not restarting the game after applying the settings, or something.
How about this then? Horizon Zero Dawn VRAM testing — ImgBB (ibb.co)
 
Who is that though? Anyone can slap a few images together. Not saying it is not true, by the way, just that until proven it cannot be used as fact that 10GB is not enough and would show the same results as those images.

Also, we need a 3080 comparison here, as that is the only 10GB card :)

Just imagine if someone was doing that but using a 6900 XT vs a 3090. I am sure you would not be in a rush to believe it until proven. Same thing here.
 

That's still the same guy as this though.........



https://www.reddit.com/r/hardware/comments/kysuk6/ive_compiled_a_list_of_claims_that_simply/gjjo7bv/

Get someone else with a different GPU to show the same.

I do have the game and played it on my Vega 56 without noticing any issues, but I removed it as I got bored of it, since I'd already played it on my PS4 Pro. I'm tempted to reinstall and see what the story is with my 3080, but it's 10GB of VRAM... that, and I cba tbh :p
 
Does not matter, he is testing with an 8GB GPU vs a 16GB GPU.

In terms of the original thread topic, furry muff. :p

Doesn't relate to the thread though. You can't compare 8GB of GDDR6 memory with a 256-bit interface on a previous-gen card against a card with 10GB of GDDR6X memory on a 320-bit interface - the throughput difference is significant.
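To put rough numbers on that throughput gap (back-of-the-envelope figures from the published memory specs, so treat them as approximate): bandwidth is roughly the per-pin data rate times the bus width divided by eight.

```python
# Rough bandwidth comparison from published memory specs (approximate figures).
cards = {
    "5700 XT - 8GB GDDR6, 256-bit, 14 Gbps": (14, 256),
    "3080    - 10GB GDDR6X, 320-bit, 19 Gbps": (19, 320),
}
for name, (gbps_per_pin, bus_bits) in cards.items():
    bandwidth_gb_s = gbps_per_pin * bus_bits / 8   # Gbps per pin x pins / 8 bits per byte
    print(f"{name}: ~{bandwidth_gb_s:.0f} GB/s")
# -> roughly 448 GB/s vs 760 GB/s, which is the gap being referred to.
```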
 
Who is that though? Just imagine if someone was doing that but using a 6900 XT vs a 3090. I am sure you would not be in a rush to believe it until proven. Same thing here.
I'd be very interested in that, but we both know that won't happen. :D
That's still the same guy as this though.........

Get someone else with a different GPU to show the same.
That's easier said than done in this climate. The number of people with two GPUs from the same brand with differing memory sizes to test this is probably minuscule... apart from reviewers.
Doesn't relate to the thread though. You can't compare 8GB of GDDR6 memory with a 256-bit interface on a previous-gen card against a card with 10GB of GDDR6X memory on a 320-bit interface - the throughput difference is significant.
Interface and bandwidth have nothing to do with capacity and how it is used though. 8GB of GDDR5 and 8GB of GDDR6 will still use the same amount of video memory capacity for rendering a texture.
 
I'd be very interested in that, but we both know that won't happen. :D
Yep. I agree. Just like I am not sure it happens with the 3080 either currently :D

But how can we be sure it does not happen with a 6900 XT and a 3090? It would be hard to test in this climate. The number of people with two GPUs from the same brand with differing memory sizes to test this is probably minuscule... apart from reviewers.

:p:p:p
:p;):p
:p:p:p


Looks like we can add Sniper Elite 4 to the list of games that adjust image quality automatically. As I recall though, this game is light on video memory usage, so it won't be a problem for 8GB cards or higher.

GTX 960 2GB vs. 4GB in 2019 – Did It End Up Mattering? - YouTube
Big difference between 2-4GB and 10GB though. Maybe it does that only in extreme cases, which would make sense. It does not mean it does it with 10GB. We won't know until it's proven anyway. I mean, if it does, then yeah, it could be the same case between a 6900 XT vs a 3090 then, no? Unlikely, I agree ;)
 
I am very happy with the card. I even posted less than an hour ago in this very thread saying I am happy that Nvidia provided a 10GB option. Maybe you missed that post
No I didn't miss the post. Everyone knows you are happy with your 10GB card and you will upgrade to next gen. I think it's in every single thread in here. :D
 
No I didn't miss the post. Everyone knows you are happy with your 10GB card and you will upgrade to next gen. I think it's in every single thread in here. :D
So why the confusion? :D

Must I sell my 3080 and buy a 6800 XT to make you happy? :p

You know you like my posts, even if they wind you up a little ;)
 
Yep. I agree. Just like I am not sure it happens with the 3080 either currently :D

But how can we be sure it does not happen with a 6900 XT and a 3090? It would be hard to test in this climate. The number of people with two GPUs from the same brand with differing memory sizes to test this is probably minuscule... apart from reviewers.

:p:p:p
:p;):p
:p:p:p
Big difference between 2-4GB and 10GB though. Maybe it does that only in extreme cases, which would make sense. It does not mean it does it with 10GB. We won't know until it's proven anyway. I mean, if it does, then yeah, it could be the same case between a 6900 XT vs a 3090 then, no? Unlikely, I agree ;)
I never said it did though; I clearly referenced 8GB vs 16GB, my personal experience of using those capacities, and that third-party post.

I would like to see it tested on a 10GB card though, vs say a 16GB or 24GB card, to see if the same behaviour happens. There's not much difference between 8GB and 10GB.

Would i pay for and use a flagship GPU with only 10GB of memory? Nope, but that's a different topic altogether. :p

Your spoiler banter fails if you read what I wrote. :D
 
Interface and bandwidth have nothing to do with capacity and how it is used though. 8GB of GDDR5 and 8GB of GDDR6 will still use the same amount of video memory capacity for rendering a texture.

Sure, I don't dispute that, but you're basing that on a very simplistic idea of how memory is used.

If a scene needs XGB of memory to render (at the specified settings) and XGB of memory isn't available, we hit a bottleneck, and the effect of that is probably more pronounced when standing still. Playing the game is a different story as the card will be swapping assets in and out of memory, which will heavily depend on how the game was built.

Also, comparing a 5700 XT and a 6800 (XT?) at 4K ultra settings isn't realistic unless you know how the game was built; diluted textures can also be a symptom of weaker processing performance.
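To illustrate the swapping point, here is a toy sketch rather than any engine's real streaming code - an LRU cache standing in for VRAM residency. Once the working set is larger than the capacity, nearly every texture request turns into an evict-and-re-upload over the bus, which is where the hitching and pop-in come from. The capacities and texture sizes below are made up for the example.

```python
from collections import OrderedDict

class VramCache:
    """Toy LRU model of texture residency in VRAM (sizes in MB)."""
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.resident = OrderedDict()   # texture id -> size in MB, LRU order
        self.uploads = 0                # misses = re-uploads = stall opportunities

    def request(self, tex_id, size_mb):
        if tex_id in self.resident:                        # hit: already in VRAM
            self.resident.move_to_end(tex_id)
            return
        self.uploads += 1                                  # miss: stream it in
        while self.used + size_mb > self.capacity and self.resident:
            _, evicted = self.resident.popitem(last=False) # evict least recently used
            self.used -= evicted
        self.resident[tex_id] = size_mb
        self.used += size_mb

# Same 12GB working set, touched repeatedly, on an 8GB vs a 16GB budget.
scene = [(i, 256) for i in range(48)]      # 48 textures x 256 MB = 12 GB
for cap_mb in (8192, 16384):
    cache = VramCache(cap_mb)
    for _ in range(10):                    # "walk around" the scene 10 times
        for tex in scene:
            cache.request(*tex)
    print(f"{cap_mb // 1024}GB card -> {cache.uploads} uploads")
```

The smaller budget ends up re-uploading on almost every request, while the larger one only pays for the first pass, which is the thrashing behaviour being described.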
 
Sure, I don't dispute that, but you're basing that on a very simplistic idea of how memory is used.

If a scene needs XGB of memory to render (at the specified settings) and XGB of memory isn't available, we hit a bottleneck, and the effect of that is probably more pronounced when standing still. Playing the game is a different story as the card will be swapping assets in and out of memory, which will heavily depend on how the game was built.

Also, comparing a 5700 XT and a 6800 (XT?) at 4K ultra settings isn't realistic unless you know how the game was built; diluted textures can also be a symptom of weaker processing performance.
It's actually a very good example: the 5700 XT and 6800 use the same memory type, speed and timings; only the capacity is different. The GPU architectures are also very similar, so it is a near-perfect test case.

I disagree. If a bottleneck is hit, it would be more pronounced as you move around, as more texture swapping would in theory be occurring. Standing still would show it in the best light, but to capture accurate screenshots and comparisons you have to stand still and look for areas where you can accurately compare image quality.

It's no good moving about and recording video as we all know what happens with encoded videos and uploading to YouTube.

Testing at 4K is entirely realistic. Testing should be done at all resolutions. It's not just one game that this happens in.
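If anyone does go down the standing-still route, diffing lossless captures of the same spot beats eyeballing compressed video. A minimal sketch using Pillow; the file names and the "noticeable" threshold are placeholders:

```python
from PIL import Image, ImageChops

# Two lossless screenshots of the same spot, same settings, taken on two cards.
a = Image.open("shot_8gb_card.png").convert("RGB")
b = Image.open("shot_16gb_card.png").convert("RGB")

diff = ImageChops.difference(a, b)      # per-pixel absolute difference
bbox = diff.getbbox()                   # None if the images are identical

if bbox is None:
    print("No per-pixel difference at all.")
else:
    # Fraction of pixels differing by more than a small threshold, so minor
    # dithering/TAA noise doesn't get counted as a texture downgrade.
    changed = sum(1 for px in diff.getdata() if max(px) > 8)
    print(f"Differing region: {bbox}")
    print(f"{100.0 * changed / (a.width * a.height):.2f}% of pixels differ noticeably")
    diff.save("difference_map.png")     # visual map of where they differ
```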
 