10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
And in the end, it all turned out to be true and validated by at least three actual owners from this forum.

Yes, well it is true, otherwise all these full-time channels and the handful of us that explored it are collectively wrong. The thread would have been only 7 pages if the denial brigade had tested it themselves, watched tech channels or trusted other members. Are you sure it wasn't one forum member, Matthew? :p
 
:D
:cry:
 
Been out all day. Blimey...

Just for clarity:
Who has denied that Bill etc. have issues? Bill even stated he isn't 100% convinced it's a VRAM issue.
I said that going from menu to gameplay, with the resultant effect of performance dropping from 70fps to 20fps (averages) did not seem vram related to me, as it happens at 4k and 3840x1600. I put this down to a bug.
Although Bill and others have said it happens below 4K too, I'll take their word for it on that.
I don't recall having a vram issue at lower than 4k, but please correct me if I'm wrong. I have had a very quick search of my posts - I did have the issue where going into menu and back reduced performance from 70fps to 20fps (averages), at multiple resolutions, but think this was a glitch rather than vram.

Running the benchmark at 4k, I believe I did have vram limitations. It would have been unplayable, so I would have just disabled the HD pack. No biggie for me.
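One way to actually separate "game bug" from "VRAM limit" in cases like this is to log GPU memory while reproducing the drop: if usage is pinned near the card's 10GB when the fps tanks, VRAM is the suspect; if there's headroom, it's more likely the menu bug. A minimal sketch, assuming an NVIDIA card with `nvidia-smi` on the PATH (the parsing works on its standard CSV output; the sample reading below is illustrative, not a real capture):

```python
import subprocess

def parse_vram_csv(line: str) -> tuple[int, int]:
    """Parse one line of `nvidia-smi --query-gpu=memory.used,memory.total
    --format=csv,noheader,nounits` output into (used_mib, total_mib)."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total

def is_saturated(used_mib: int, total_mib: int, threshold: float = 0.95) -> bool:
    """Heuristic: usage pinned near the card's limit during the fps drop
    suggests a VRAM bottleneck rather than an engine/menu bug."""
    return used_mib >= threshold * total_mib

def sample_vram() -> tuple[int, int]:
    """Take one live sample (requires an NVIDIA GPU and driver)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_vram_csv(out.splitlines()[0])

# Offline example using a hypothetical captured reading on a 10GB 3080:
used, total = parse_vram_csv("9850, 10240")
print(is_saturated(used, total))  # prints: True - this close to 10GB points at VRAM
```

Run `sample_vram()` in a loop (e.g. once a second) while going menu-to-gameplay; a drop to 20fps with usage well below the total would back up the "glitch, not VRAM" reading.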
 
Seems someone is now having issues with counting :cry: Since when do 2 reviewers equate to "all" the reviewers out there.... Or are those ones not on the list of reputable reviewers/tech press now :cry:


Thanks for confirming Bill.

Was aware of the menu issue, but sadly some would insist on that also being down to VRAM even though it was/has been acknowledged as an issue by the developers (no clue if it has been fixed yet as I haven't checked their forums in a while). Do recall you using a UW 1600 screen and saying it played without issues too. IIRC, didn't you state in-game 4K was playing OK on your end, at least no fps plummets to 1-5fps?
 
@Joxeon you can scour the threads I have posted and you will find the only points I have made historically regarding VRAM have been about 4K or UW 1440p. There is a massive difference, though, from one person and a couple of games to the reality. All people do in your scenario is ignore the frequency and pretend it does not exist. If you're gaming under other conditions it has nothing to do with what I have been mentioning and all is good with the world! :p
There are still many people with older cards like GTX 1080s etc. that game at 4K with reduced settings without running into VRAM issues despite only having 8GB, and the reason for this is that once you start dropping settings to maintain 60fps your VRAM usage drops. In fact I'd say that the majority of 4K gamers are running cards with less than 10GB of VRAM.
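The resolution side of that point can be sketched with back-of-the-envelope arithmetic: render targets scale linearly with pixel count, so dropping from 4K toward 1440p shrinks that slice of the budget. A rough illustration only - the buffer count and bytes-per-pixel are simplifying assumptions, not any real engine's numbers, and textures (especially HD packs) usually dominate the total:

```python
def render_target_mib(width: int, height: int,
                      bytes_per_pixel: int = 4, buffers: int = 3) -> float:
    """Rough VRAM taken by colour render targets at a given resolution.
    Assumes RGBA8 (4 bytes/pixel) and triple buffering; real engines add
    depth, G-buffer and post-processing targets on top of this."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for name, (w, h) in {"4K": (3840, 2160), "UW 1600": (3840, 1600),
                     "1440p": (2560, 1440)}.items():
    print(f"{name}: {render_target_mib(w, h):.0f} MiB")
```

With these assumptions it works out to roughly 95 MiB at 4K versus 42 MiB at 1440p for the colour buffers alone - small next to textures, but every resolution-dependent target (depth, G-buffer, post effects) scales the same way, which is why lowering resolution and settings together pulls usage back under an 8GB card's limit.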
 
IIRC, didn't you state in game 4k was playing ok on your end, at least no fps plummets to 1-5fps?
Honestly can't remember. Main screen upstairs, 4k downstairs and it was a faff going from one to the other all the time (can't use a keyboard when using downstairs screen). Just found out it's free for a weekend, so I'll try it again Sunday/Monday.

As far as I remember, UW 1600 was fine until it did the random drop to 20-odd (not a stutter, it stayed there. The first point was coming out of the sewer into the building area, then later it was often, but not always, triggered by using the menu) but the 4K bench was a disaster. Can't recall if I got as far as 4K gameplay, but I'll look at my old posts and also give it another go in a couple of days.

EDIT - Just checked, don't think I did gameplay at 4k. It's not a game I'm too interested in, but wanted to do tests to see how it went. All my gameplay references were on monitor (to use M&K) at UW 1600.
 
Still wondering how this got to 300 pages. Ah well, it will be replaced by a "12GB VRAM enough for the 4080? Discuss.." thread when it appears next year...
 
I'm so hoping AMD stick to 16GB and Nvidia exceed that VRAM, then watch as someone creates a thread "is 16GB enough" and finds 1 scenario to hammer in that 16GB isn't enough for "multiple games"..... The mental gymnastics will be glorious, it'll be just like the Fury X days all over again :cry:
 
Well it played fine on my end so *shrugs* Sorry forgot I need to force rebar on so as to gimp my setup and then encounter said issues :cry:



PS. just for a laugh, the fury x days ;) :p

Christ for over a thousand pounds bud i want my games maxed out! lol

2560x1440p I have everything maxed, did use AMD CHS shadows but it causes blocky shadows instead; advanced I run all maxed out minus frame scaling.
Not going to happen on two gpu's bud, it's why 4GB is sufficient for 4K, even with two gpu's. To use more than 4GB consistently you'll need three or more gpu's at 4K. :)

That said i tune my settings to keep certain FPS ranges so I'm always in the (FreeSync) range. :p
How about everything maxed out at 4K including in the advanced tab minus any type of AA, What sort of performance would we be looking at for a single Fury X ?
I've not tested it, but the FPS would likely be too low for my liking. There would be dips under 40FPS for sure.
 
Given the increase in games utilizing RT over the next couple of years, coupled with the newer, more powerful hardware soon to be arriving, I'd imagine anyone on the current crop of AMD cards will be having to drop settings more frequently, as even the RT-lite AMD-sponsored titles we got this gen would come to an end if RDNA3 proves more capable.
 
Exactly.

I've pointed that out a few times now: how AMD owners have had to turn down RT settings or sacrifice them entirely since "day 1" (especially since AMD didn't have any upscaling tech to compete with DLSS until only a couple of months back), and in more than two handfuls of games - no amount of VRAM can fix that.... But for some reason no one makes noise about that. Oh sorry, "it makes no difference to visuals" *awaits Matt posting a screenshot/video where RT isn't in use* ;) :cry: :D

I don't have any faith in RDNA 3 RT perf, as there are no leaks or any word at all except for AMD saying "it is more advanced than RDNA 2" :( But I'm hoping to be pleasantly surprised, as it means I can go back to AMD and then watch as RT is claimed as the best thing ever :p
 

This may be true, but not really the question of this thread.
 
Yup I agree as we have a thread for it here:

 

If you get a chance that would be nice - it's not like the debunker would admit after all these months that it affects his setup. ;)

Don't go out of your way though Bill, it's something we've done to death - no need to preach to the converted. It would help remove your doubt and test whether it was indeed a game bug or a hardware limitation though.
 