
10GB VRAM enough for the 3080? Discuss..

Status
Not open for further replies.
I never said it did though, I clearly referenced 8GB vs 16GB, personal experiences of using those capacities, and referenced that third-party post.

I would like to see it tested on a 10GB card though, vs say a 16GB or 24GB card, to see if the same behaviour happens. There's not much difference between 8 and 10GB.

Would I pay for and use a flagship GPU with only 10GB of memory? Nope, but that's a different topic altogether. :p

Your spoiler banter fails if you read what I wrote. :D
Lol. Okey dokey.

So to conclude, there is no evidence yet that the 3080 with its 10GB has issues with texture pop-in. But we would all look forward to a reputable source taking a look at this :D

Also, let me repost the link from a reputable source that got this all going again:

https://www.youtube.com/watch?t=747&v=dAtsqtYIF5U&feature=youtu.be

Make sure you watch from 12.27 to 12.44. That really should put this to bed :D

I would even say from 12.27 to 12.40, as the finish sounds more fitting to this thread and my prediction :p
 
Starting to worry about you now, TNA. Am I conversing with a wall of bricks? :p

No testing done at all in the video you just linked, based on what we have just spent pages discussing.

From your video, quotes: "Appears to", "I don't anticipate".

Does that sound like any testing was done at all based on what we've just been talking about?
 
Lol. Come on Matt. I never claimed the video did any testing. I clearly said before posting that link that there was no evidence (which would clearly mean no tests by any credible source), so I do not see how you would jump to that conclusion. The reason I used that link was to show that, as of right now, 10GB is perfectly fine thus far ;)
 
I'm only going off what you wrote.

You said it needs testing, that there has been no testing, then quote a video where there was no testing and say this is a video that shows it is perfectly fine and puts it all to bed.

You are making my head hurt. :D
 
It's actually a very good example: the 5700 XT and 6800 use the same memory type, speed and timings; only the capacity is different. The GPU architecture is also very similar, so it is a near-perfect test case.

I disagree; if a bottleneck is hit, it would be more pronounced as you move around, since more texture swapping would in theory be occurring. Standing still would show it in the best light, but to capture accurate screenshots and comparisons you have to stand still and look for areas where you can accurately compare image quality.

It's no good moving about and recording video as we all know what happens with encoded videos and uploading to YouTube.

Testing at 4K is entirely realistic. Testing should be done at all resolutions. It's not just one game that this happens in.

I don't think anyone will dispute that the 5700XT is worse than the 6800, especially at 4K where memory quantity becomes more important. It also makes sense, and the screenshots prove, that controls are in place to reduce certain settings to meet whatever limitations each card has.

That being said, you can't extrapolate from those results that the RTX 3080 would suffer the same issue, as it's a significantly better card. If the issue is specifically a memory capacity issue then sure, but testing a card with equal rasterisation performance will also matter when it comes to texture quality.

I unfortunately can't test the game with my 3080 at 4K as I play at 1440p, although I'd be more interested in gameplay performance as opposed to screenshots, but as you said, it's hard to compare compressed videos.
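For the standing-still screenshot comparisons described above, one crude objective measure would be a mean per-channel pixel difference between two captures. This is a toy sketch over plain RGB tuples rather than real image files (all names and values here are made up for illustration):

```python
def mean_abs_diff(img_a, img_b):
    """Mean absolute per-channel difference between two equally sized
    images, each given as a flat list of (R, G, B) tuples (0-255).
    0.0 means identical; larger values mean bigger visual differences."""
    assert len(img_a) == len(img_b), "images must be the same size"
    total = sum(abs(ca - cb)
                for pa, pb in zip(img_a, img_b)
                for ca, cb in zip(pa, pb))
    return total / (len(img_a) * 3)

# Two tiny 2x2 "screenshots": identical except a slight colour shift.
a = [(100, 100, 100)] * 4
b = [(110, 100, 90)] * 4
print(round(mean_abs_diff(a, b), 2))  # 6.67
```

In practice you would load real screenshots with an imaging library first; the point is just that uncompressed stills allow a numeric comparison that YouTube-compressed video does not.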
 
I didn't extrapolate that, but I would like to see it tested to find out either way, and I agree with you.

Until it is tested on different GPUs, I suspect it will just be incorrectly labelled as a 5700 XT issue.
 
The quote was made to put it to bed, as in: here is a reputable source that tests cards all the time, and he is saying 10GB is fine. That is it. Not my fault you understood differently and gave yourself a headache :p
 
Whoosh.
 

Yeah, but he is saying 10GB is fine in a video that does not show any tests to say 10GB is fine. It has no evidence either way.

That's like saying it's not raining outside when you haven't looked outside or checked the weather online or on TV.
 
Dude. I got that. But had such a thing been a known issue, I am sure he would have said so, no? Or do you think he would gloss over that fact when talking about 10 vs 16GB? Basically, thus far there is no evidence that 10GB is not enough; if there were, he would not have said what he said. Nothing to do with whether the video set out to test it or not :)
 
https://www.reddit.com/r/hardware/comments/kysuk6/ive_compiled_a_list_of_claims_that_simply/gjjo7bv/

That appears to be what happens when using a slow storage system, e.g. an HDD instead of an SSD, and it is amplified at times by low system RAM. This is the issue that DirectStorage/RTX IO is aiming to solve. It may be more apparent on a GPU with less VRAM due to assets being swapped more frequently. The only way to solve it with current game design would be to create cards with more VRAM than games have texture data; to put it another way, this will still happen no matter how much VRAM you have until game design makes use of DirectStorage/RTX IO.
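The swapping behaviour described there can be sketched as a toy LRU texture cache. This is purely illustrative (the budget, texture names and sizes are all made up); the point is just that a smaller VRAM budget forces more re-streaming from storage, which is where pop-in shows up:

```python
from collections import OrderedDict

def count_evictions(requests, sizes, vram_budget_mb):
    """Simulate an LRU texture cache and count how often a resident
    texture must be evicted and later re-streamed from storage
    (a rough proxy for visible pop-in)."""
    cache = OrderedDict()  # texture name -> size in MB, oldest first
    used = 0
    evictions = 0
    for tex in requests:
        if tex in cache:
            cache.move_to_end(tex)  # mark as recently used
            continue
        # Not resident: stream it in, evicting least-recently-used first.
        while used + sizes[tex] > vram_budget_mb and cache:
            _, freed = cache.popitem(last=False)
            used -= freed
            evictions += 1
        cache[tex] = sizes[tex]
        used += sizes[tex]
    return evictions

sizes = {"rock": 512, "road": 512, "tree": 512, "sky": 512}
walk = ["rock", "road", "tree", "sky"] * 10  # player moving through a scene

# A budget that holds the whole working set never evicts;
# a smaller one thrashes, swapping textures constantly.
print(count_evictions(walk, sizes, vram_budget_mb=2048))  # 0
print(count_evictions(walk, sizes, vram_budget_mb=1024) > 0)  # True
```

Fast streaming (the DirectStorage/RTX IO idea) doesn't change the eviction count here; it shortens how long the gap is visible when a miss happens.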
 
I think 10GB is enough, but I have seen evidence on here that in certain circumstances it's not enough, i.e. at 4K. Therefore, if you are trying to prove your point, you need to quote someone showing evidence to back up what you are saying, or you will continue to get an argument from the other side. You wanted the people against your argument to take someone's word in a YouTube video. It's hardly credible and you are not doing yourself any favours.
 
One is a reputable YouTuber, the other is a random post on Reddit. Yeah, that is the same.

Also, not only that, no other reputable source has said 10GB is currently not enough either.

At the end of the day, I am not trying to write a scientific paper about the bloody topic. Just some fun banter, let's keep that in mind :p
 
My 2p:

Don't forget that GDDR6X is faster than GDDR6. 6X comes into its own at higher resolutions, from what I've been reading.

Had to run my system RAM at stock yesterday to pinpoint a crash in Unreal Engine games (3600MHz down to 2400MHz), and boy does that make a difference to frame variance and particularly the fps lows. Today needs a new run-through of the recent DRAM calc and applying those settings, as I can't get my RAM stable at its 3600MHz spec. :(
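On the fps lows: "1% low" usually means the average fps over the slowest 1% of frames (some tools instead report a percentile of frame times, so definitions vary). A minimal sketch, with made-up frame times in milliseconds:

```python
def one_percent_low_fps(frame_times_ms):
    """Average fps over the slowest 1% of frames (at least one frame)."""
    worst = sorted(frame_times_ms, reverse=True)  # longest frames first
    n = max(1, len(worst) // 100)
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth 10 ms frames plus one 50 ms stutter frame.
times = [10.0] * 99 + [50.0]
print(round(1000.0 / (sum(times) / len(times)), 1))  # average fps: 96.2
print(round(one_percent_low_fps(times), 1))          # 1% low fps: 20.0
```

This is why a single stutter barely moves the average fps but tanks the 1% low, and why slower system RAM tends to show up in the lows before it shows up in the average.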
 

You should know by now that VRAM discussion is not fun banter; it gets quite heated. It's almost the same as Nvidia vs AMD. So if you are going to join in, expect fireworks. :) Why do you think this thread is 149 pages long! :p
 
Haha. Yeah, I guess it can get that way :p

I do enjoy the topic, hence why, when watching the HU video, that part stood out and I posted it here, which got the thread back up and running again. Lol :D
 
Going from a 5700 XT 8GB to a 3080 10GB, gaming at 1440p, absolutely no issues yet. I don't regret it at all. I do regret getting a Gigabyte, but with the stock issues I didn't really have a choice, as I wanted to get a card before January. I saw the other day that HUB was doing a laptop review, and even on laptops you can get a 16GB 3080 mobile chip, according to their spec sheet.
 