
10GB VRAM enough for the 3080? Discuss..

Status: Not open for further replies.
I think Nvidia would be pushing more VRAM now, but for the shortages, just to look the same as AMD. Most people just go to one media outlet and don't care if it's objectively telling them the truth. So 16GB is best, subjectively. FSR is better than DLSS, subjectively. AMD cards are faster, subjectively. It doesn't matter if almost every game benchmarked runs faster on an AMD CPU, or if SAM support in that game made the AMD cards faster.

Same with FSR: you get a lot of games that no one plays, like Terminator: Resistance.

People will just see that 16GB is bigger than 10GB and then make up whatever reason they can to justify it. After all, a bigger number is better, right?

No one considers AMD or Nvidia cards as a performance balance between DXR and raster games, because the media turned reviews into "DLSS and DXR are irrelevant, therefore AMD wins". Now that FSR is out, it's already better than DLSS and will only get better in time, while DLSS was ripped apart in pixel-by-pixel comparisons with native-resolution images.

With such a biased media reviewing products, 16GB will win over 10GB in the hearts and minds of customers, regardless of whether games actually need 10GB or more. Most customers will hear an opinion from something like Hardware Unboxed or some other gaming review channel and never engage their critical thinking or seek a second, objective opinion.
You actually think the media is on AMD's side. :cry:

Also, I'm pretty certain the part in bold is something you made up.
 
What would the extra VRAM be used for?

I've asked this many times, but no one seems to have an answer.

PR war. Take this YouTube comment, for example.


Elijah1
1 week ago
Tried this on Riftbreaker. I'm impressed with the FPS improvements I got with RTX on a 1660 Ti.

This is in reference to an FSR video. He's able to run DXR better and is impressed with the FSR performance increase on his 1660 Ti.

People see a bigger number as better, regardless of whether it makes any sense. Because they accept these statements, you can't say anything about it. Their ego is so invested in it that a truth or statement attacking it cannot be reconciled internally, so they lie and attack people. A bigger number can be better, and a smaller number can be enough; you have to research it yourself. But "bigger is better" is so ingrained that you can never argue against it, whether you are telling the truth or not.
 
You actually think the media is on AMD's side. :cry:

Also, I'm pretty certain the part in bold is something you made up.

Video about FSR quality.

Elijah1
1 week ago
Tried this on Riftbreaker. I'm impressed with the FPS improvements I got with RTX on a 1660 Ti.
DXR on a 1660 Ti, with FSR leading the way in performance. He's impressed.


Paul_Sleeping
1 week ago
The saddest thing for me is that even watching this video on a nice 4K monitor, I still can't tell the difference with FSR when it comes to picture quality. Text, yes, but that is as far as it goes. Growing old sucks. If I was in my teens or 20s, I'm sure I would be able to.


JayT's Tech Reviews
1 week ago
Hearing it's right on par with DLSS 2.0. Impressive for a first shot, and no specific hardware needed.

neogeomaster
1 week ago
FSR is better

conyo985
1 week ago
Just watched Digital Foundry and I don't agree.

JayT's Tech Reviews
1 week ago
@conyo985 I will keep my opinions neutral until I actually get my hands on FSR. I trust Gamers Nexus and their opinions, and the only bad thing they had to say was about anything below the quality setting... either way, competition only benefits the consumer, so it's a win for us.


fluff fluff
1 week ago
@Violet Unlike DLSS 1.0, which just blurred the whole image to oblivion in Metro Exodus and Battlefield, FSR actually retains a great amount of quality, and I dare say this is almost as good as DLSS 2.0, or maybe 1.9.


RobBCactive
1 week ago
As it's open, researchers might use it as a test bed for improved algorithms, or some teenager might come up with a good idea :)

PLAYER1
1 week ago
If you watch Moore's Law is Dead (YouTuber), his sources confirmed early on that AMD was very confident in their first iteration of FSR. So having it confirmed by tech channels is a nice feeling, because the ironic thing is AMD is doing everything right compared to Nvidia and their "Ultimate Play" to screw gamers over. This will also benefit the Series X/S and PS5, since they also use AMD GPUs. In fact, this is where it will see the most gains: moderate hardware with improved performance at the high end.

J Simmons
1 week ago
I tested this in Terminator: Resistance. Even in performance mode I can't tell the difference. My eyes must be broken.


Condor
1 week ago
Heck yeah, I was actually right on the comparisons before they were labeled. The native one was pretty easy to spot, though, since it wasn't keeping a consistent frame rate, which made the other two a little easier to decide on. Just look for the smudginess on edges and in texture details. This was a great video, though; the in-depth look at it all is fantastic information to munch on.


Jason Fane
1 week ago
I was stoked: sitting on my couch 8 feet away from the 65" 4K TV, I was able to accurately determine which demo was native 4K and which ones were using FidelityFX. I was looking for compression artifacts in the explosions, which is what gave it away for me. Great video and breakdown, Steve, and that was a fun way to test the viewer.

Cordial Wombat
1 week ago
Ladies and gentlemen, you can try FSR yourself. The Riftbreaker has it in its Steam demo.


DirtyMisfit
1 week ago
Looks like dog crap to me

DirtyMisfit
1 week ago
@caldark2005 No, the game seemed fine; it was just when I turned FSR on that it went to trash. The only thing that half looked good was ultra quality, and even that was blurry on your mech; off looked better. Running a 6700 XT with ray tracing on ultra too, but I wanted to see what FSR could do. Not impressed just yet; hopefully they release it for RE Village some time soon.

The video mocks Nvidia products like the RTX 3080 Ti, and the comments point it out.

Tan Kok Onn
1 week ago
The random insert of "GEFORCE RTX 3080 TIE" gets me every time
 
What would the extra VRAM be used for?

I've asked this many times, but no one seems to have an answer.
I've answered this a thousand times.

Texture quality has been stagnant for a while, and if Nvidia pushes its own 16GB cards, you can be sure there will be crazy texture settings that will redefine how games can look.

I mean, if you call that the 3080 being obsolete in just two years, OK, that's another discussion I won't be going into. Some people have the luxury of upgrading every two years; for those people this discussion is as pointless as it can be, and if you're one of them, you have no business discussing these topics here. This topic is solely about beyond two years of usage. Say, four years later, if these next-gen textures become mainstream, the RX 6800 XT will be able to handle them; the 3080 won't.
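To put rough numbers on the VRAM cost of more detailed textures (my own back-of-the-envelope arithmetic, not figures from any game): texture memory grows with the square of resolution, so each step up gets expensive fast.

```python
# Back-of-the-envelope VRAM cost of a single colour texture.
# Assumptions (mine, for illustration): uncompressed RGBA8 at 4 bytes
# per texel vs BC7 block compression at 1 byte per texel; a full mip
# chain adds roughly a third on top (1 + 1/4 + 1/16 + ... = 4/3).

def texture_mib(size, bytes_per_texel, mips=True):
    total = size * size * bytes_per_texel
    if mips:
        total *= 4 / 3
    return total / 2**20

for size in (2048, 4096, 8192):
    print(f"{size}x{size}: RGBA8 {texture_mib(size, 4):6.1f} MiB,"
          f" BC7 {texture_mib(size, 1):5.1f} MiB")

# 2048x2048: RGBA8   21.3 MiB, BC7   5.3 MiB
# 4096x4096: RGBA8   85.3 MiB, BC7  21.3 MiB
# 8192x8192: RGBA8  341.3 MiB, BC7  85.3 MiB
```

Each doubling of resolution quadruples the footprint, which is why a genuine jump in texture detail, rather than an upscaling hack, would eat into a 10GB buffer so quickly.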
 
I've answered this a thousand times.

Texture quality has been stagnant for a while, and if Nvidia pushes its own 16GB cards, you can be sure there will be crazy texture settings that will redefine how games can look.

I mean, if you call that the 3080 being obsolete in just two years, OK, that's another discussion I won't be going into. Some people have the luxury of upgrading every two years; for those people this discussion is as pointless as it can be, and if you're one of them, you have no business discussing these topics here. This topic is solely about beyond two years of usage. Say, four years later, if these next-gen textures become mainstream, the RX 6800 XT will be able to handle them; the 3080 won't.

We already have 4K texture packs, and the 3080 does fine with them. Are you suggesting we are going to go 8K, and that the already out-of-date 6800 XT will handle them due to having more VRAM?
 
We already have 4K texture packs, and the 3080 does fine with them. Are you suggesting we are going to go 8K, and that the already out-of-date 6800 XT will handle them due to having more VRAM?

A game can provide a 4K texture pack and still look bad. The textures are a one-and-done deal; they just do some tricks to provide an additional layer of textures, and the textures themselves are still bland and bad. WD: Legion is a prime example: the textures still look current-gen even with the 4K texture pack. Do they stand out? No. Monster Hunter World: 4K texture pack. Is it standing out compared to other games? A big no...

I think you're missing the point of this discussion. I'm not talking about 4K textures, I'm talking about textures being more detailed. You can have crap-looking textures at 4K, so what? Metro Exodus EE supposedly has enhanced textures for 4K, but it still falls flat in many cases. It's clear they just did some trick/upscaling hack and called it a 4K texture pack.

RDR 2 consumes 7-8GB of VRAM at 4K, and yet the ground still looks bad. How do you think you will push more detailed textures in that game when it already maxes out all the GPUs out there with its current-gen textures?

Or RE Village, for that matter, which pushes 9.5-9.6GB of VRAM usage with its bland-looking textures. How do you think Capcom can push texture quality further with this amount of VRAM? I mean, is it even possible? If RE Village has 4K textures, then they're pretty much damned; it looks horrible no matter what you do. The illusion of a beautiful image breaks away once you start noticing the horrible old-gen textures all around.

No need to derail the thread.

List me some of those "4K texture pack" games... I'll wait. They won't look any better than RDR 2, The Last of Us 2, or any other current-gen game for that matter. Why, you wonder?
 
A game can provide a 4K texture pack and still look bad. The textures are a one-and-done deal; they just do some tricks to provide an additional layer of textures, and the textures themselves are still bland and bad. WD: Legion is a prime example: the textures still look current-gen even with the 4K texture pack. Do they stand out? No. Monster Hunter World: 4K texture pack. Is it standing out compared to other games? A big no...

I think you're missing the point of this discussion. I'm not talking about 4K textures, I'm talking about textures being more detailed. You can have crap-looking textures at 4K, so what? Metro Exodus EE supposedly has enhanced textures for 4K, but it still falls flat in many cases. It's clear they just did some trick/upscaling hack and called it a 4K texture pack.

RDR 2 consumes 7-8GB of VRAM at 4K, and yet the ground still looks bad. How do you think you will push more detailed textures in that game when it already maxes out all the GPUs out there with its current-gen textures?

Or RE Village, for that matter, which pushes 9.5-9.6GB of VRAM usage with its bland-looking textures. How do you think Capcom can push texture quality further with this amount of VRAM? I mean, is it even possible? If RE Village has 4K textures, then they're pretty much damned; it looks horrible no matter what you do. The illusion of a beautiful image breaks away once you start noticing the horrible old-gen textures all around.

No need to derail the thread.

List me some of those "4K texture pack" games... I'll wait. They won't look any better than RDR 2, The Last of Us 2, or any other current-gen game for that matter. Why, you wonder?

In such games it's not that the textures are bad, but how the engine uses them. They are stretched and squeezed to fit the topology.
 
In such games it's not that the textures are bad, but how the engine uses them. They are stretched and squeezed to fit the topology.

I hope you're right then, but I eagerly wait to see things improve in games. For something like five years, textures have been the same across all games... PS3 to PS4 was a huge leap in terms of texture quality in games; this can be observed between TLOU 1 and TLOU 2... I would expect a 16GB buffer to enable never-seen-before textures in games, but we shall see with time.
 
In such games it's not that the textures are bad, but how the engine uses them. They are stretched and squeezed to fit the topology.
Erm, what?

Unwrapping and mapping of textures is done by a person. Even in instances of procedurally generated textures, these are tweaked by an artist.

If a game engine is altering texture fitment onto topology after the artist has laid them out, then the engine is broken, because I don't see a reason why it should do that.
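To spell out what I mean by fitment: it's defined by per-vertex UV coordinates that the artist lays out, and at runtime the engine only samples the texture through them. A toy sketch of that relationship (the mesh data and function names here are made up, purely for illustration):

```python
# Toy illustration of UV mapping: each mesh vertex carries a (u, v)
# coordinate laid out by an artist, and the engine samples the texture
# through those coordinates; it never re-fits the texture on its own.

# A fake 8x8 greyscale "texture".
texture = [[row * 8 + col for col in range(8)] for row in range(8)]

# One triangle: a position plus artist-authored UVs in [0, 1].
vertices = [
    {"pos": (0.0, 0.0, 0.0), "uv": (0.0, 0.0)},
    {"pos": (1.0, 0.0, 0.0), "uv": (1.0, 0.0)},
    {"pos": (0.0, 1.0, 0.0), "uv": (0.5, 1.0)},
]

def sample(tex, u, v):
    """Nearest-neighbour fetch at normalised (u, v)."""
    h, w = len(tex), len(tex[0])
    return tex[min(int(v * h), h - 1)][min(int(u * w), w - 1)]

for vtx in vertices:
    print(vtx["pos"], "->", sample(texture, *vtx["uv"]))
```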
 
Video about FSR quality.

I'm assuming we are talking about DLSS 2.0, unless you want to lay out arguments as to why DLSS 1.0 is better than FSR.

There is only one comment on that list that states FSR is better than DLSS 2.0, and that is from some random nobody. But I guess I was technically wrong; you didn't make it up.
The rest of the comments are literally irrelevant.

The video mocks Nvidia products like the RTX 3080 Ti, and the comments point it out.

So what? When did Nvidia become immune to being mocked, especially on a cash-grab product?
 
Erm, what?

Unwrapping and mapping of textures is done by a person. Even in instances of procedurally generated textures, these are tweaked by an artist.

If a game engine is altering texture fitment onto topology after the artist has laid them out, then the engine is broken, because I don't see a reason why it should do that.

It's a game engine that the artist uses to create the topology in the first place. The artist doesn't have time to check every aspect of the design, as the engine does a good enough job on its own. For example, you must have seen artists creating a forest using different brush sizes, placing many trees of differing types at once. Do you really think they stop and check every tree?
 
It's a game engine that the artist uses to create the topology in the first place.
If by topology you mean the polygons on the model, then no, just no. The vast majority of models have their topology created by an artist. The only use cases that come to mind where topology is created by an algorithm are simulation data and tessellation, and even tessellation can be manipulated by an artist by altering the base topology.
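To spell out the tessellation point: the tessellator only refines whatever base topology the artist authored, so editing the base mesh changes every derived level of detail and the artist stays in control. A minimal midpoint-subdivision sketch, purely illustrative and not any engine's actual tessellator:

```python
# Minimal tessellation sketch via midpoint subdivision: the algorithm
# only refines the artist's base triangle, so editing the base topology
# changes every derived level of detail. The artist stays in control.

def midpoint(a, b):
    return tuple((p + q) / 2 for p, q in zip(a, b))

def subdivide(tri):
    """Split one triangle into four by inserting edge midpoints."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

base = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]  # artist-authored base mesh
level = base
for _ in range(3):  # three tessellation levels
    level = [small for tri in level for small in subdivide(tri)]
print(len(base), "base triangle ->", len(level), "triangles")  # 1 -> 64
```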

If by topology you mean something else, then please explain so I can understand.


The artist doesn't have time to check every aspect of the design, as the engine does a good enough job on its own.
:cry::cry::cry::cry:

That's how you end up with games filled with bugs; see CP2077.


For example, you must have seen artists creating a forest using different brush sizes, placing many trees of differing types at once. Do you really think they stop and check every tree?

Those trees could have been made by a different artist, who did review them to make sure they look right. Or they could be procedurally generated trees, in which case they would have been tested to ensure the correct output.

In the example you're talking about, it is probably about five unique trees, given random rotation and scale to make it look like a forest of unique trees.
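That technique is plain instancing. A rough sketch of what such a brush tool effectively generates (the mesh names, counts, and ranges are made up for illustration):

```python
# Sketch of a forest "brush": roughly five unique tree meshes instanced
# hundreds of times with randomised position, rotation, and scale, so
# the result reads as a forest without hand-placing every tree.
import random

UNIQUE_TREES = ["oak", "pine", "birch", "fir", "dead_oak"]  # the source meshes

def paint_forest(center, brush_radius, count, rng=None):
    rng = rng or random.Random(42)
    instances = []
    for _ in range(count):
        instances.append({
            "mesh": rng.choice(UNIQUE_TREES),
            "position": (center[0] + rng.uniform(-brush_radius, brush_radius),
                         0.0,
                         center[1] + rng.uniform(-brush_radius, brush_radius)),
            "yaw_degrees": rng.uniform(0.0, 360.0),  # random rotation
            "scale": rng.uniform(0.8, 1.3),          # random scale
        })
    return instances

forest = paint_forest(center=(100.0, 250.0), brush_radius=40.0, count=500)
print(len(forest), "instances from", len(UNIQUE_TREES), "unique meshes")
```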
 
AMD, quite rightly, has little market share, and the number of Nvidia cards with more than 10GB of VRAM isn't worth considering. Does it make sense that game developers would suddenly target such a niche market? Perhaps these game developers you have been talking to are fresh off the bus.

Yes, Nvidia has a huge market share compared to AMD, but the 10GB 3080 should only last for a bit, say a year. The developers are from major software houses!
They (the bosses) are pushing for more eye candy to draw more gamers to their titles when they show off their video trailers, so the old 8GB VRAM standard has to go.
 
Sure, but you can do that predictively. That's why modern game engines divide games into zones: they store a list of what can be seen from one zone to every other zone, and at what LOD those objects would be. They look at the player's movement and position, infer what will be needed in the next zone, and can pre-load that before it's needed. If you do all of this well, then you can have many, many times more assets in a playable area than can be held in VRAM, and the assets are just swapped in and out as you play.
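A minimal sketch of the idea; the zone layout, visibility lists, and names are all invented for illustration, not taken from any particular engine:

```python
# Sketch of zone-based predictive streaming: each zone stores which
# zones are visible from it and at what LOD (lower = more detail); the
# streamer preloads assets for the predicted next zone before arrival.

VISIBILITY = {
    "valley": {"valley": 0, "ridge": 2, "town": 3},
    "ridge":  {"ridge": 0, "valley": 1, "town": 1},
    "town":   {"town": 0, "ridge": 2},
}

loaded = {}  # zone -> LOD currently resident in VRAM

def predict_next_zone(velocity):
    # Stand-in for real prediction from player movement and position.
    return "ridge" if velocity[0] > 0 else "town"

def stream(current_zone, velocity):
    wanted = dict(VISIBILITY[current_zone])
    for zone, lod in VISIBILITY[predict_next_zone(velocity)].items():
        wanted[zone] = min(lod, wanted.get(zone, 99))  # preload next zone's needs
    for zone in list(loaded):
        if zone not in wanted:
            del loaded[zone]  # evict what's no longer visible or predicted
    loaded.update(wanted)     # (re)load residents at the required LOD

stream("valley", velocity=(5.0, 0.0))
print(loaded)  # {'valley': 0, 'ridge': 0, 'town': 1}
```

The point is that residency is driven by a small "wanted" set derived from visibility plus prediction, so the full asset set can be far larger than what fits in VRAM.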

LtMatt and I had this out over Far Cry 5 way back in the thread. I think he used a 16GB AMD card to play FC5 at the same settings as me, and I recorded myself on a 3080 in a helicopter flying around the map, jumping out, getting into fights, then back in and flying to other parts of the map, all in rapid succession, then doing the same in a jeep, cruising at high speed across the entire open map. You can see the memory usage graphed in my video: it rapidly swaps between 5GB and 7GB as the different areas stream their textures in and out, and it's no problem.

Yep, you can do it predictively to some extent, but it's still not a cure for not enough VRAM.
Texture streaming is good for certain situations, such as flight sims.
 