10GB VRAM enough for the 3080? Discuss...

RE2 **** it at launch in the exact same places the 3070 did in @Tired9's vid (when you ran from the back to the front of the precinct building). Different build, eh, but I haven't played it since. I don't need to argue about it if it isn't doing it now and the 10GB is sufficient; I'll take it as a win that it doesn't run out any more.

Maybe MS/NV can fix MSFS so that it doesn't run out of VRAM any more as well, but they might not be able to, and 10GB is insufficient in this case too.
I prefer straight shooters who just say "I can't afford it" or "another card is the better option for me".


That right there is all that needs saying: you buy what you can afford and live with the best GPU at that price point. If it's not one of the top GPUs in the range then obviously it can't do everything, as the better options are there for a reason.

At the end of the day, no matter how good anyone thinks the 3080 is, it's not bulletproof, because the VRAM runs out on a few occasions. Still a great GPU.
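
On the "runs out of VRAM" claims: if anyone wants to check whether a game is actually hitting the 10GB ceiling rather than stuttering for some other reason, a minimal sketch along these lines will log usage while you play. This is Python with the pynvml NVML bindings; the 9.5GB "getting close" marker is my own arbitrary pick. Bear in mind NVML reports allocated VRAM, and games often allocate more than they strictly need, so a full reading alone doesn't prove starvation.

```python
# Minimal VRAM logger using NVIDIA's NVML bindings (pip install pynvml).
# Prints used/total dedicated VRAM once per second; Ctrl+C to stop.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        # 9.5 GiB is an arbitrary "getting close" marker for a 10 GiB card
        flag = "  <- near the 10GB ceiling" if used_gb > 9.5 else ""
        print(f"VRAM: {used_gb:.2f} / {total_gb:.2f} GiB{flag}")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```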
 
The footage from me for RE2 is using the first version/build, I think (can't be bothered checking, but I'm 99% sure, as I got the game after Tired9's first posts/videos on the issue), unless the issue is now fixed on the 3070 too? But again: if it happens at just that one spot you and Tired9 are referring to (and also in my footage) and is now fixed with a patch, surely that indicates something wrong with the game? If it were purely a lack of VRAM you wouldn't be able to patch it out, and the devs wouldn't have bothered if they didn't see it as an "issue". I do also recall people saying that having HDR on caused performance issues even on a 3080 Ti/3090, so maybe that contributed to any performance problems.

I'll try to get MSFS installed this week and see what's what. I played it when it first came to Game Pass, on my 4K OLED, and had no issues back then.

No one has disputed your last point either, only pointed out that it doesn't make much sense to spend an additional £750 for an extra 10-15% performance boost in gaming alone. "Future proofing" is a silly argument, as we have all pointed out; the money saved will buy a next-gen GPU, which will offer far more.
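
To put rough numbers on that (card prices assumed purely for illustration, around £650 for a 3080 versus £1,400 for a 3090; the uplift figures are the 10-15% quoted above):

```python
# Back-of-envelope cost per extra frame. Prices and the baseline fps are
# assumptions for illustration, not quotes from anywhere.
base_price, big_price = 650.0, 1400.0   # hypothetical 3080 vs 3090 prices (GBP)
base_fps = 60.0                         # pick any baseline

for uplift in (0.10, 0.15):
    extra_fps = base_fps * uplift
    cost_per_fps = (big_price - base_price) / extra_fps
    print(f"{uplift:.0%} uplift: ~£{cost_per_fps:.0f} per extra fps on a {base_fps:.0f} fps baseline")
```

Even on the generous 15% figure, that works out at over £80 per extra frame per second.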
 
RE2 Remake is just crap, full stop. Badly implemented GFX tech: HDR is washed out regardless of how you tweak the settings, and RT just doesn't work right. Poorly optimised in general. RE3, RE7, Village etc. are all superior in their use of the GFX tech, flawless in all regards.
 
You know why that is: RE3 and RE7 are both AMD-sponsored titles.
 
IIRC, didn't you also mention performance issues specifically when you enable HDR? That's with a 3080 Ti at 3440x1440 too?
 
Oh yeah, it dropped to around 44fps (sometimes less) at random when both RT and HDR were enabled. A bit annoying, but the washed-out colour grade in HDR on an OLED just didn't fly.
 
Not at 1000nits, the art style is completely different and suits the cyberpunk future city lights etc, RE2 is a completely different type of game that relies on pitch black scenes which look off when everything is off-shade black!
 
*puts on debunking hat*

:p

So after some testing and research, it seems, Tommy, that you need to complain to Microsoft about their implementation of DX12, assuming you're trying to use it? :p


Tried DX12 on my end and it's a **** show regardless of settings. After some googling, it turns out loads of people are saying the same regardless of GPU. Here are two videos, from a 3090 and a 6900 XT, to rule out VRAM and brand:

3090:


6900 XT:


I suppose DX12 is still in "beta", so we can't complain too much.

Here is my video of the 3080 with the "ultra" preset/settings in New York, as I imagine this is the most demanding area in the world for VRAM usage and also grunt. (Ignore the average fps figure; it obviously isn't right, given the fps was in the 40s most of the time...)


Sadly I couldn't find any AMD footage of New York (with an fps overlay and a somewhat recent "build"), but even in areas that wouldn't be as demanding as NY, it looks to me like the 3080 is holding up pretty well. Obviously these aren't like-for-like scenarios, so make of that what you will... Oh, and before people say "zOMG texture/asset pop-in!!!!", it's there in the 3090 and 6900 XT footage too ;)
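
On that broken overlay average: the honest way to get it is from frame times rather than averaging an fps counter. A quick sketch, assuming a plain text log with one frame time in milliseconds per line (the filename and format are hypothetical; adjust the parsing for whatever CapFrameX/Afterburner actually exports):

```python
# Derive average fps and the 1% low from per-frame times in milliseconds.
def summarise(frametimes_ms):
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s        # true average: frames / elapsed time
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    slice_1pct = worst[: max(1, len(worst) // 100)]
    low_1pct_fps = 1000.0 / (sum(slice_1pct) / len(slice_1pct))
    return avg_fps, low_1pct_fps

# Hypothetical capture file: one millisecond value per line.
with open("msfs_ny_run.txt") as f:
    times = [float(line) for line in f if line.strip()]

avg_fps, low_fps = summarise(times)
print(f"avg: {avg_fps:.1f} fps, 1% low: {low_fps:.1f} fps")
```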



So, to conclude: I don't know about anyone else, but to me it looks like, once again, no GPU has enough grunt for an enjoyable experience at ultra/4K in this game, when even the 3090 and 6900 XT are usually in the 40s with dips into the 30s.


I was going to test DLSS too, but for some reason the setting isn't showing on my end. Seems I'm not alone though:




Next please :p
 
Ahh, a one-trick pony. TBH it's no surprise that MSFS magically turns up as "not enough VRAM" after the entire time it's been out.
 
Indeed :cry: I did show this with several sources earlier on, but apparently they were using a "different build" so they don't count... :p


Looking back at the comparisons in the above link, performance seems much the same as at release, maybe somewhat improved? Again, it's hard to say when I'm testing a dense, very built-up city like NY and others test less built-up areas...
 
Who needs retinas anyway? :p

I think you may have mental problems if you can stand watching 1000 nits.

You two don’t seem to understand how it works :p

It is not 1000 nits for the whole screen at once; the monitor or TV could not do that even if it wanted to. It is for a small percentage of the screen. Say, for example, a scene contains fire: that fire can be at 1000 nits while the rest of the screen sits at 150 nits. This has the effect of making the fire stand out and pop, which looks awesome.

A game where I found HDR makes a huge difference, which no one talks about and even I forget at times, is Mortal Kombat 11. The one that stood out to me in a big way was Raiden: his lightning really pops off the screen. The light must be 1000 nits on those, I reckon, and it is very bright. But what percentage of the screen is that? 2%, or maybe less? Hope that helps explain it :D
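
That "small bright highlight" behaviour is baked into how HDR10 encodes brightness in the first place. As a rough illustration, here's the PQ (SMPTE ST 2084) EOTF that HDR10 uses, with constants straight from the spec; only the top of the 0-1 signal range maps to four-digit nit values, so a 1000-nit fire and a 150-nit room can coexist in the same frame:

```python
# PQ / SMPTE ST 2084 EOTF: maps a non-linear 0..1 HDR10 signal value to
# absolute luminance in nits (cd/m^2). Constants are defined in the spec.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(e: float) -> float:
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for signal in (0.25, 0.50, 0.75, 1.00):
    print(f"signal {signal:.2f} -> {pq_to_nits(signal):8.1f} nits")
# Roughly: 0.50 is only ~95 nits, ~0.75 is ~1000 nits, 1.00 is the 10,000-nit ceiling.
```

The panel's brightness limiter then only has to sustain the peak over that small window, which is why a 2% patch of Raiden's lightning can hit 1000 nits while a full-screen white never would.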
 
1000 nits and no retinas or chopping off your hair to get rid of 1000 pesky itchy nits.
 