10GB VRAM enough for the 3080? Discuss...

Obviously some games are more demanding than others, but on balance, games coming out around the same time as this generation of cards or later will increasingly see the 3070 struggle at 4K - even though it can play many recent or older titles at 4K at great frame rates.
That is self-explanatory, but new games are only a small fraction of the PC's immense back catalogue, even from just the last 5 years. There are a ton of graphically impressive and amazing games to play at 4k without touching any new stuff for a long time.

So yeah, to say the 3070 "isn't a 4k card" is something that always makes me cringe, because it is such a disingenuous, vague, lazy and inaccurate statement.
 
You can game at 4K on a mid-range card, but just be prepared to turn down a few settings, whether it's due to VRAM or raster power. It was the same with a 1070 or 2070.
 
I'll be trying a 4k monitor soon, but I'll see how I get on with it before deciding if I send it back. Really fairly happy with my current 1440p 24" 165hz screen tbf; 24-27" is my ideal range for having everything in sight for FPS games etc.

Going 4k basically means having to buy every new-gen high-end card that comes out though; if VRAM is an issue with the 3080 then settings obviously have to be lowered.
 
Going 4k basically means having to buy every new-gen high-end card that comes out though

No it does not, especially now that DLSS is rolling out to the majority of new games. DLSS gives close to a generational increase in performance to current cards, so it is more than possible to have a 3090 or 3080Ti and have it last through the next generation before really needing to upgrade again.
 
When 8GB VRAM isn't enough and the 3060 wipes the floor with the 3070

I can still hear Jensen's voice telling us the 3070 is faster than the 2080ti, hahahhaahahha

I really hope that the artist on MLIDs is right and developers just assume that the majority of gamers in the next 3 years are running a 3060 and say tough luck to everyone else.
 
No it does not, especially now that DLSS is rolling out to the majority of new games. DLSS gives close to a generational increase in performance to current cards, so it is more than possible to have a 3090 or 3080Ti and have it last through the next generation before really needing to upgrade again.
3090s are stupid money though. DLSS certainly helps; I guess as long as the card below the top is enough performance-wise then it doesn't matter much to me anyway. Hopefully Nvidia will be more generous with the VRAM next time if possible.
 
That is self-explanatory, but new games are only a small fraction of the PC's immense back catalogue, even from just the last 5 years. There are a ton of graphically impressive and amazing games to play at 4k without touching any new stuff for a long time.

So yeah, to say the 3070 "isn't a 4k card" is something that always makes me cringe, because it is such a disingenuous, vague, lazy and inaccurate statement.

I think you are missing what pretty much everyone else understands - that it is a relative term for the generation and tier, which might change in the future as resolutions increase and GPUs target different things, but that isn't where we are now.

Same as the 3090 generally "isn't" a 1080p card.

You are getting your knickers in a twist for no good reason.
 
I think you are missing what pretty much everyone else understands - that it is a relative term for the generation and tier, which might change in the future as resolutions increase and GPUs target different things, but that isn't where we are now.

Same as the 3090 generally "isn't" a 1080p card.

You are getting your knickers in a twist for no good reason.
My knickers aren't in a twist and my logic hasn't changed. It will run many games at 4k.
 
That is self-explanatory, but new games are only a small fraction of the PC's immense back catalogue, even from just the last 5 years. There are a ton of graphically impressive and amazing games to play at 4k without touching any new stuff for a long time.

So yeah, to say the 3070 "isn't a 4k card" is something that always makes me cringe, because it is such a disingenuous, vague, lazy and inaccurate statement.

This has always bothered me, not just about the 3070 but more generally with this so-called standard of what is "4k capable". The simple fact is that games range enormously in what they demand from the GPU, and if your standard is that a card isn't a 4k card unless every game can reach a playable frame rate at max settings in 4k, then ZERO cards are 4k cards.

This is because some settings in computer rendering have no real upper limit on how they scale. Some of course do: you can't have textures higher quality than the highest-quality textures provided in the game assets. But things like draw distance, LOD distances, screen resolution, anti-aliasing and so much more don't really cap out at a maximum threshold. Often they might in a graphics menu, because these things are sometimes sliders or have several discrete values to pick from, so the user's choice is sometimes limited. Fundamentally, what decides these values in the graphics menu is what is realistic to use at the time, given current and upcoming hardware. If you set them too high, gamers will complain they can't run at "max settings" and that the game is "unoptimized".

My experiments pushing RDR2 to its limits recently were interesting. Using a combination of render scale, which is essentially a type of SSAA that renders the game at a resolution higher than your monitor and then downsamples it, added to regular MSAA, which takes multiple sub-pixel samples, you could drive the game's performance demands and memory usage way past what a 3090 can handle. Not just at 4k but also at 1080p. If our standard is that you're not a 4k card unless you can run all games maxed in 4k, then no cards are 4k ready, and no cards are even 1080p ready. This highlights how stupid a standard this really is.
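To put the render-scale point in concrete terms, here is a minimal sketch of what render scale / SSAA boils down to: shade k² times as many pixels, then box-filter down to the output resolution. This is illustrative Python/NumPy only, not anything from RDR2's actual renderer; the function name and the 2x figure are just for the example.

```python
import numpy as np

def ssaa_downsample(frame: np.ndarray, k: int) -> np.ndarray:
    """Box-filter a frame rendered at k times the target resolution.

    frame: (k*H, k*W, 3) array, the supersampled render.
    Returns the (H, W, 3) average - i.e. ordered-grid k*k SSAA.
    (Toy model, not a real renderer's resolve pass.)
    """
    h, w, c = frame.shape
    return frame.reshape(h // k, k, w // k, k, c).mean(axis=(1, 3))

# A 1.5x render scale shades 1.5^2 = 2.25x the pixels of native res,
# which is how render scale blows past a GPU's budget even at 1080p output.
hi_res = np.random.rand(2 * 1080, 2 * 1920, 3)  # stand-in for a 2.0x render
out = ssaa_downsample(hi_res, 2)
print(out.shape)  # (1080, 1920, 3)
```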

It needs to be qualified somehow, and how you do that, god only knows. As Richdog says, PC gaming has been around for a long time; my Steam account now lists 1055 games, all of which I play in 4k with settings maxed, with the caveat of not using something like resolution scaling or SSAA/MSAA, and the caveat that for games with RT I use DLSS to achieve 4k. So it's nuanced, and when people ignore that nuance it really bothers me; it's normally motivated by trying to make some other stupid point.
 
I think you are missing what pretty much everyone else understands - that it is a relative term for the generation and tier, which might change in the future as resolutions increase and GPUs target different things, but that isn't where we are now.

Same as the 3090 generally "isn't" a 1080p card.

You are getting your knickers in a twist for no good reason.

No, I think he understands that. But even taking that into account, if your standard is that exceptions set the rules, then no card is 1080p capable, much less 4k capable, because there are always exceptions where you can push the limits of current hardware, even on games that are kinda old by today's standards. This is what this thread has mostly been about: desperately hunting for that one single game that disproves the rule so people can go "AH HAH, see I told you so".

In reality most people have some set of complex rules in their head that define what is and is not 4k or 1080p capable, and everyone's rules will likely differ, so the discussion just devolves into argument unless everyone involved first agrees on the rules, and no one ever does.
 
Yeah. I don’t give two hoots if people think a 3070 is not a 4K graphics card. It will play pretty much all my games bar a few just fine at 4K 60fps and I am fine with that until next gen cards come out next year :D

Not bad for £480 considering some are paying around £400 for second hand 1080Ti’s still :cry:
 
No, I think he understands that. But even taking that into account, if your standard is that exceptions set the rules, then no card is 1080p capable, much less 4k capable, because there are always exceptions where you can push the limits of current hardware, even on games that are kinda old by today's standards. This is what this thread has mostly been about: desperately hunting for that one single game that disproves the rule so people can go "AH HAH, see I told you so".

In reality most people have some set of complex rules in their head that define what is and is not 4k or 1080p capable, and everyone's rules will likely differ, so the discussion just devolves into argument unless everyone involved first agrees on the rules, and no one ever does.

There is always going to be some subjectiveness to it - but few are going to buy a 3060/70 class card with the intention of playing the latest and greatest games at 4K with quality settings which complement that. Sure they can play older games, or newer ones at reduced quality settings, at 4K, but it is always a relative position - it isn't about the exceptions.

Yeah. I don’t give two hoots if people think a 3070 is not a 4K graphics card. It will play pretty much all my games bar a few just fine at 4K 60fps and I am fine with that until next gen cards come out next year :D

Not bad for £480 considering some are paying around £400 for second hand 1080Ti’s still :cry:

Try CP2077 at 4K with all the bells and whistles - to get decent frame rates you need DLSS performance mode, and with that quality drop you might as well play at 1440p. Many games contemporary with the card will, at higher quality settings, dip enough below 60 FPS that it isn't ideal.
 
I generally tend to play at 2k on my 2060, which is the 6GB version. Without ray tracing I still get decent enough performance in Doom Eternal, and a lot of the other games my sons play are older anyway (Bioshock, Roblox etc). I do favour image quality over FPS, but a G-Sync monitor does help smooth everything out.

A 3090 for me is overkill, but stock on the 3080 and 3080 Ti means I will have to wait. I would still run an HDMI cable from the PC to the TV if I wanted to test 4k gaming, as my monitor is only 1440p.
 
There is always going to be some subjectiveness to it - but few are going to buy a 3060/70 class card with the intention of playing the latest and greatest games at 4K with quality settings which complement that. Sure they can play older games, or newer ones at reduced quality settings, at 4K, but it is always a relative position - it isn't about the exceptions.

Try CP2077 at 4K with all the bells and whistles - to get decent frame rates you need DLSS performance mode, and with that quality drop you might as well play at 1440p. Many games contemporary with the card will, at higher quality settings, dip enough below 60 FPS that it isn't ideal.

Even a 3090 can drop below 60fps at points in both Valhalla and CP2077. A 3070 will run most games at 4k if you are willing to fiddle with some graphical settings (and even then there will be a very low number of games where you will actually need to do this).
 
Try CP2077 at 4K with all the bells and whistles - to get decent frame rates you need DLSS performance mode, and with that quality drop you might as well play at 1440p. Many games contemporary with the card will, at higher quality settings, dip enough below 60 FPS that it isn't ideal.
I already played that when I had my 3080. I likely won’t play it again until next gen cards. Nice cherry picking though :p

No chance I am going 1440p. There is a big difference in image quality for me. I already have a 1440p 165hz freesync/gsync monitor in the house; 4K is just on another level, and in almost all cases I would rather play at 4K high settings than at 1440p ultra settings. The gap between high and ultra typically carries a big performance penalty, yet you need to take stills to see the difference in IQ, whereas with 4K you get the extra IQ by default.

At the end of the day, as I said, I just look at my Steam library and what upcoming games I want to play this year, and the 3070 will deal with almost all of them for my needs. A 3080 would obviously be more ideal, but I was too late grabbing one in this drop. The 3070 will do fine anyway. Who knows, I might get a 3080 again in the next drop and sell the 3070 to pay for it. Lol.
 
There is always going to be some subjectiveness to it - but few are going to buy a 3060/70 class card with the intention of playing the latest and greatest games at 4K with quality settings which complement that. Sure they can play older games, or newer ones at reduced quality settings, at 4K, but it is always a relative position - it isn't about the exceptions.

Try CP2077 at 4K with all the bells and whistles - to get decent frame rates you need DLSS performance mode, and with that quality drop you might as well play at 1440p. Many games contemporary with the card will, at higher quality settings, dip enough below 60 FPS that it isn't ideal.
CP2077 is an unoptimised and buggy POS, we all know that, so using it as a key benchmark of 4k gaming isn't a fair reflection of a card's capabilities, as even a 3090 struggles with it. There are a crapton of games that are better optimised and fantastic looking. Many reviews of the 3070 were done in recent months and it generally runs modern 4k games at 60fps or over.
 
Because games look horrible and blurry at 1440p render; that's normal.

Modern engines need a minimum of 4k of pixel information to properly look "clear" and "sharp". Anything less fails to deliver a clear experience.

It has nothing to do with your monitor resolution either: just push the res scale up and the image will look as sharp/clean as it did on a 4k screen. New game engines need more pixels to accommodate TAA in modern games such as RDR2.

Playing at 1440p is effectively playing at 720-900p due to valuable pixel information lost to TAA. Same at 4K: you're practically playing at old-1440p quality due to valuable pixel information lost to TAA.

1080p takes the biggest hit; the image will look like 360-540p in motion due to the pixel info lost.

With TAA, 1080p, 1440p and 4k are not really their native resolutions anymore. The image only looks native when you stand still or in still scenes. That's why 4k tends to look better: it preserves more pixel info when the camera is moving.

A 720p game from the MSAA/forward-rendering era will look sharper, cleaner and more refined than a 1440p game on a TAA/deferred engine.

Go play Witcher 2, or old AC games, or any game released before the invention of the PLAGUE called TAA. They will look sharp and superior at whatever resolution you play them. Play them on both your 1440p and 4k screens and you will see there's no practical difference, because each screen provides a sharp, clean image thanks to having no temporal anti-aliasing. 4k on such games only provides more anti-aliasing, and that's about it. 4k on modern TAA games is not there to provide AA; it is there to make the image look clearer in motion.
 
MSAA uses spatial anti-aliasing, and it has a number of quality issues.

Alpha testing
Alpha testing is a technique, common in older video games, used to render translucent objects by rejecting pixels from being written to the framebuffer. If the alpha value of a translucent fragment is not within a specified range, it is discarded after alpha testing. Because this is performed on a pixel-by-pixel basis, these pixels do not receive the benefits of multi-sampling: all of the multisamples in a pixel are discarded together based on the alpha test. The resulting image may contain aliasing along the edges of transparent objects or edges within textures, although the image quality will be no worse than it would be without any anti-aliasing. Translucent objects modelled using alpha-test textures will also be aliased due to alpha testing. This effect can be minimized by rendering objects with transparent textures multiple times, although that results in a large performance reduction for scenes containing many transparent objects.
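As a toy model of why the alpha test defeats MSAA: the test runs once per fragment, so a pixel's coverage samples either all survive or are all discarded, leaving no gradient along alpha-tested edges. A minimal sketch, with made-up names and a hypothetical 0.5 cutoff:

```python
import numpy as np

def shade_pixel_msaa(coverage: np.ndarray, alpha: float,
                     cutoff: float = 0.5) -> float:
    """Toy 4x-MSAA resolve for one pixel of an alpha-tested fragment.

    coverage: bool array of which of the 4 sub-pixel samples the
    triangle covers. The alpha test runs once for the whole fragment,
    so it keeps or kills every covered sample at once.
    """
    if alpha < cutoff:        # alpha test rejects the whole fragment
        return 0.0
    return coverage.mean()    # geometric edges still resolve smoothly

edge = np.array([True, True, False, False])  # half-covered pixel
print(shade_pixel_msaa(edge, alpha=0.49))    # 0.0 -> hard, aliased edge
print(shade_pixel_msaa(edge, alpha=0.51))    # 0.5 -> smoothed geometry edge
```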

Aliasing
Because multi-sampling calculates interior polygon fragments only once per pixel, aliasing and other artifacts will still be visible inside rendered polygons where the fragment shader output contains high-frequency components.

Performance
While less performance-intensive than SSAA (supersampling), it is possible in certain scenarios (scenes heavy in complex fragments) for MSAA to be multiple times more intensive for a given frame than post-processing anti-aliasing techniques such as FXAA, SMAA and MLAA. Early techniques in this category tend towards a lower performance impact but suffer from accuracy problems. More recent post-processing-based anti-aliasing techniques such as temporal anti-aliasing (TAA), which reduces aliasing by combining data from previously rendered frames, have reversed this trend, as post-processing AA becomes both more versatile and more expensive than MSAA, which cannot anti-alias an entire frame alone.

Temporal anti-aliasing (TAA) is an anti-aliasing technique that combines information from past frames and the current frame to reduce aliasing in the current frame.

TAA compared to MSAA
Prior to the development of TAA, MSAA was the dominant anti-aliasing technique. MSAA samples (renders) each pixel multiple times at different locations within the pixel and averages the samples to produce the final pixel value. In contrast, TAA samples each pixel only once per frame, but it samples pixels at different locations in different frames. This makes TAA faster than MSAA. In parts of the picture without motion, TAA effectively computes MSAA over multiple frames and achieves the same quality as MSAA at lower computational cost.
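A minimal sketch of that accumulation idea, assuming a static scene: jitter the sample each frame and exponentially blend it into a history buffer, which converges towards the same average MSAA would compute from many samples at once. Real TAA also reprojects history with motion vectors and clamps it (the step responsible for the motion blur complained about earlier); this sketch omits all of that, and the blend factor is just an illustrative value.

```python
import numpy as np

def taa_step(history: float, current_sample: float,
             blend: float = 0.1) -> float:
    """One toy TAA accumulation step: exponentially blend this frame's
    single jittered sample into the running history buffer."""
    return blend * current_sample + (1.0 - blend) * history

rng = np.random.default_rng(0)
true_value = 0.6   # what a many-sample (MSAA-style) average would give
history = 0.0
for _ in range(200):
    # one sample per frame, jittered to a new sub-pixel position
    sample = true_value + rng.normal(0.0, 0.2)
    history = taa_step(history, sample)
print(round(history, 2))  # close to 0.6 on a static scene
```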

TAA compared to FXAA
TAA and FXAA both sample each pixel only once per frame, but FXAA does not take into account pixels sampled in past frames, so FXAA is simpler and faster but cannot achieve the same image quality as TAA or MSAA.

TAA compared to DLSS
Nvidia's DLSS operates on similar principles to TAA. Like TAA, it uses information from past frames to produce the current frame. Unlike TAA, DLSS does not sample every pixel in every frame. Instead, it samples different pixels in different frames and uses pixels sampled in past frames to fill in the unsampled pixels in the current frame. DLSS uses machine learning to combine samples in the current frame and past frames, and it can be thought of as an advanced TAA implementation.
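Going only by the description above (shade a subset of pixels each frame, fill in the rest from history), here is a toy version of the sparse-sample-plus-history idea. Purely illustrative: real DLSS adds motion-vector reprojection and a learned network for the fill-in, neither of which is modelled here, and all names are made up.

```python
import numpy as np

def temporal_fill(history: np.ndarray, samples: np.ndarray,
                  sampled: np.ndarray) -> np.ndarray:
    """Toy temporal upsampling: keep freshly shaded pixels where we
    sampled this frame, reuse the previous output everywhere else."""
    return np.where(sampled, samples, history)

h, w = 4, 8
history = np.full((h, w), 0.5)   # last frame's output
samples = np.zeros((h, w))       # shade only even columns this frame
samples[:, ::2] = 1.0
sampled = np.zeros((h, w), dtype=bool)
sampled[:, ::2] = True
frame = temporal_fill(history, samples, sampled)
print(frame[0])  # [1. 0.5 1. 0.5 ...] - odd columns carried over
```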

Red Dead Redemption 2 DLSS vs TAA vs MSAA (video, 2:05)

DLSS image looks better. MSAA and TAA look close. TAA looks a little softer. DLSS looks sharper and more detailed.

Guilherme Fonseca (1 day ago): It seems that with DLSS the image is much sharper.

Enrico D. (1 day ago): It's 2.2.10; just take the 2.2.6 from techpowerup and replace it. It was confirmed to me that it removed ghosting and flickering.

Thomas (1 day ago): Oh wow, I'm really surprised. DLSS looks fantastic. Just look at the first scene: the background is sharper and more detailed. I can finally disable TAA. And DLSS gives you more FPS.

CaptainMcShotgun (1 day ago): Better than native!

Fabibi an (1 day ago): This was sooo needed. Finally. TAA makes the game look blurry, FXAA is just... bad, and MSAA makes your performance go poop. DLSS saves the day.

rockkiller124 (2 hours ago): DLSS saved this game's performance entirely without sacrificing a lot of graphical detail.

Hazmi Fly (19 hours ago, edited): Easy win for DLSS. Quality almost the same as MSAA and less blurry than TAA, with more performance than both.
 
Yeah. I don’t give two hoots if people think a 3070 is not a 4K graphics card. It will play pretty much all my games bar a few just fine at 4K 60fps and I am fine with that until next gen cards come out next year :D

Not bad for £480 considering some are paying around £400 for second hand 1080Ti’s still :cry:


Those people only read reviews and don't understand that at 4k you turn things like AA off or set them very low. They read reviews with apples-to-apples comparisons where everything is set to ultra. In reality you turn stuff off at 4k, as many settings exist to enhance potato resolutions, which 90% of people play at.
 