
10GB VRAM enough for the 3080? Discuss...

There is no competition between PC and consoles: console games win GOTY every time, while games on PC...
The PC has great potential, but for some reason no one is making great games for it anymore. Probably because the most talented game developers have already been bought by Sony and MS and are making games for consoles.
This generation of consoles will also be fine, since they are more powerful than the old gen and the games made in the last years of the old gen were some of the best-looking games ever. If AMD FSR turns out to be any good, that will also help the consoles.
The 3080 owners will also be fine. They will use DLSS or FSR, or drop a few settings if games go above the 10GB, and that's it; all of us have dropped some settings sometime in the past to be able to play a game or two. I am more worried about the cost of the next gen of video cards for PC. The cost of PC hardware is insane at the moment, it's like the 90s again. Probably soon it will be like the 80s and you will pay tens of thousands for a PC. :)
 
So this is what happened. He made a video comparing the performance of Devil May Cry 5 ported to PS5 with the 120Hz mode unlocked against the PC performance of the game on a GTX 1060.

With that, and just that, he came to the conclusion that the PS5 is a GTX 1060.

It's one of the worst videos he's made in a long time.

Agree, that video was a bit of a joke; he's slowly becoming a YouTube drama queen and getting annoying. Seems the YouTube trend these days is all about drama clickbait.
 
Yeah, I tried to address that briefly (though not well) in my last comment, i.e. 'comparisons aren't as straightforward', etc.
I mean, for starters... and correct me if I'm wrong (always a possibility, btw), but don't the new consoles have assets that can be finagled to some extent to serve a game/app more flexibly than on a PC? (IIRC the last-gen ones could, to some extent.)
If that is the case, then overall comparisons would be even harder to make stick, surely?
But yeah, it's a hard throw to cut PS5 (or XSX) potential that far back, certainly placing them on par with my last desktop (6700K, 1070), which might have matched the PS4 Pro/XB1X. I'll admit I heard Steve out (and GN are usually on point) while thinking 'can it be that bad?'

Again, as I mentioned elsewhere, nothing against consoles here. I last had a PS4/XB1 (aside from the admittedly barely used Switch I have now) until 2015, before going full-on PC build (the PS4 was the better one, imo). The gf wants a PS5 eventually and I'm down, if I can play the exclusives from PS3/4 that I missed (the Uncharted series, Spidey, GoT, etc.), unless they make their way to PC first, which is way more likely these days than Total War etc. appearing on console.
 
all of us have dropped some settings sometime in the past to be able to play a game or two.


Oh, for sure.

I've spent the last 18 months with one of those unfortunately trendy thinner gaming laptops.
It could play AAA games maxed out, good specs for 1080p ultra tbh, but I tend to cap fps quite hard because I don't like rocket-engine noise and heat only two feet away...

Can't wait for new deskie to arrive :)
 
Lol, comparing a 1060 to the new consoles... I have no idea why such a famous YouTuber is spouting that kind of nonsense. He didn't even get the game comparison right, since DMC5 runs at a higher resolution than the one he claimed; basically zero homework done. But I keep seeing plenty of people bringing his vids up as arguments, so spouting ******** clearly works.

I'm afraid the more it happens, the more the followers will have to sit up one day and DYOR. Some of these tubers generate the hype or poison minds with very tainted perspectives. As you say, if you read it often enough it gets adopted as fact, which is just as worrying.
 
I'm pretty certain that this isn't true

Well, on most rough gauges of performance, yes, it really is true: 12 TFLOPS in the Xbox Series X and about 10.3 TFLOPS in the PS5, whereas a 3080 sports about 29.7 TFLOPS. As for things like dedicated RT cores, the consoles fall short there too, with nowhere near enough ray tracing performance to do most visual effects in real time, as their APUs are built around AMD's designs for the PC and so suffer the same lack of any real RT horsepower.

And this follows a general rule of thumb for consoles: on the day they launch to the public, they typically use hardware modeled on and customized from the prior generation of PC GPUs, and sport a GPU that is something like the mid-range from the last gen. So they launch, at the very best, middle of the road when compared to the PC spectrum. We can expect, based on prior developer behaviour, that for the next 2-3 years they will focus on releasing a one-size-fits-all game for their multi-platform releases, where the PC gets no special attention on top of the consoles, with the possible exception of some additional ray tracing features if they're easy to implement. And so the performance demands on the PC for the next 2-3 years are going to remain fairly static for most of our games, with obviously a few exceptions.

This is just how things have gone ever since multi-platform titles became a thing and development of PC games became tied into the business models of consoles. The 3080 and the AMD equivalent high end will remain capable cards for a while yet, just because of that synchronization with the console cycle.
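For reference, those headline figures are just the theoretical FP32 numbers, i.e. shader count × 2 ops per clock (FMA) × clock speed. A quick back-of-the-envelope sketch using the commonly quoted specs; paper numbers only, not a claim about actual gaming performance:

Code:
def fp32_tflops(shader_units, boost_clock_ghz):
    # FP32 FLOPS = shader units * 2 ops per clock (FMA) * clock; /1000 turns GFLOPS into TFLOPS
    return shader_units * 2 * boost_clock_ghz / 1000.0

specs = {
    "PS5 (36 CUs @ 2.23 GHz)": (2304, 2.23),
    "Xbox Series X (52 CUs @ 1.825 GHz)": (3328, 1.825),
    "RTX 3080 (8704 shaders @ ~1.71 GHz boost)": (8704, 1.71),
}
for name, (shaders, clock) in specs.items():
    print(f"{name}: ~{fp32_tflops(shaders, clock):.1f} TFLOPS")
# -> roughly 10.3, 12.1 and 29.8 TFLOPS respectively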
 
Can't compare teraflops between different architectures and then try to extrapolate gaming performance from it, unfortunately.
 
People keep saying that but I've never seen any evidence of it ever happening. Feel free to show me an example.

And some people will try to bring up console exclusives. That's not a comparison: for one, the game isn't on PC to compare against; secondly, it's funded by Sony, who has endless pockets, so it isn't representative of the wider industry; and thirdly, having to develop for only a single machine is a huge benefit, since you save a ton of resources on programmers and can instead put that money into animation, motion capture, etc. So once again it has absolutely nothing to do with console vs PC. PC has produced its own exceptional exclusive games; it just doesn't have an industry body funding it to the point where it makes financial sense. Epic Games is about the closest thing we have to Sony for PC at this point: they do provide funding if you go to them saying you want to make an exclusive game, but their funding is minuscule compared to Sony's (Epic will give you $10-20 million, Sony will give you $100-200 million).

Devs have squeezed a lot of performance out of the PS4, as can be seen in The Last of Us Part II and Ghost of Tsushima. You could say these games could be done on PC easily, but let's see an old GTX 750 or 960 match that with a weak AMD Jaguar-class CPU at half the clock speed. In a few years I expect the PS5/XSX to showcase games which are far better than the current titles.
 
You've made assumptions and guesses in this post that are wrong.

The biggest crime of them all was assuming that all TFLOPS are equal.
 
Exactly. The 6800 XT is around 20.7 TFLOPS and the 3080 is 29.7 TFLOPS, yet we see the 6800 XT is faster in many games, especially those developed in partnership with AMD. I expect the consoles to perform similarly to a 3060 Ti/2080 and, over time, surpass the 3070/2080 Ti. The 2080 Ti can do 13.5 TFLOPS, which is not much more than the XSX.
 
Yeah, indeed, Ampere TFLOPS are misleading.

The 2080 Ti has 13.45 TFLOPS
The 3070 has 20.21 TFLOPS

So it's clear that the number is inflated, at least for games. I don't know about other applications; maybe there really is some FP32-heavy application where Ampere shines more.

But for games, I would cut it in half and multiply by 1.3x, since that is roughly how much each Ampere core performs in games compared to Turing:

(20.21 / 2) * 1.3 = 13.13 TFLOPS, now that is more like it
or
(30 / 2) * 1.3 = 19.5 TFLOPS, which is also more like it and more in line with the 6800 XT (20.7 TFLOPS), which is what the 3080 actually competes with.
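Putting that rule of thumb into a couple of lines, just to make the arithmetic explicit (it's only the heuristic above, not an official metric):

Code:
def ampere_gaming_tflops(paper_tflops, turing_factor=1.3):
    # Halve the paper FP32 figure (the shared FP32/INT datapath), then apply the
    # assumed ~1.3x per-core gaming uplift over Turing. Rough heuristic only.
    return (paper_tflops / 2) * turing_factor

print(ampere_gaming_tflops(20.21))  # RTX 3070 -> ~13.1, close to a 2080 Ti's 13.45
print(ampere_gaming_tflops(30.0))   # RTX 3080 -> ~19.5, in the same ballpark as a 6800 XT's 20.7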
 
Devs have squeezed a lot of performance out of the PS4, as can be seen in The Last of Us Part II and Ghost of Tsushima. You could say these games could be done on PC easily, but let's see an old GTX 750 or 960 match that with a weak AMD Jaguar-class CPU at half the clock speed. In a few years I expect the PS5/XSX to showcase games which are far better than the current titles.
Yeah.

Bruh. Check these videos out.

https://www.youtube.com/watch?v=JxUPJdcChzE

https://www.youtube.com/watch?v=gGf4SVWEw2g

https://www.youtube.com/watch?v=9OE2iI7OLh8

Wish he had the guts to reiterate these comparisons for modern games. But then the VRAM argument kicks in, which I can't argue with. At the same time, it is the same situation as buying a 3070/3080: buying a 2GB 770 or 3GB 780 at a time when the PS4 and Xbone slammed 8GB on the desk is the same as buying an 8/10GB 3070/3080 at a time when the PS5/XSX are slamming 16GB on the desk.

So in the end, neither the 770 nor the 780, and not even the 780 Ti, was enough to play the next-gen games that the PS4 played gracefully, and that is probably what will happen with the 3070/3080.

By the time actual quality next-gen games hit (I'd guess 2023-2024), the 3070 and 3080 will be obsolete in terms of VRAM, and by that time their owners will have played only mediocre games at better quality, called themselves PCMR, and then proceeded to buy a new GPU and spend 3-4 times more than a console. Welcome to the "enthusiast" world of PC gaming, I guess.

Or just buy a 6800 XT and at least you will have guaranteed RDNA2 optimizations like the RX 480/RX 580 had :)
 
This debate seems to have veered off more into 3080 Ti and console comparisons, but my take on VRAM is:

  • VRAM is a relatively easy ceiling to avoid, as there are usually specific settings that reduce VRAM usage without massively impacting other visual quality effects (e.g. most games have a texture setting that directly correlates to VRAM; rough numbers are sketched after this list). Just because you can make games use lots of VRAM by hiking up settings doesn't mean it's good bang for buck. It's not like other aspects of gaming where you have poor performance and are powerless to do much about it without crippling the visuals, if you can even fix it at all (say, a poorly threaded engine hitting a CPU bottleneck)
  • Many cards run out of puff in general at high settings in demanding games (let's say 4K with some AA), so whether they are low on VRAM or not becomes a bit of a moot point, as you wouldn't run at those settings anyway
  • On the flipside, the RTX 3080 is a high-end card, so the sort of compromises people like myself might make on textures etc. might get a few backs up from the hardcore enthusiasts dropping £650+ on a card. It's more plausible that an RTX 3080 might hit a VRAM limitation first when compared to the usual tranche of average cards, as you might actually be able to drive a high resolution
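To put a rough number on the first bullet (textures being the main VRAM knob), here's a purely illustrative back-of-the-envelope; the formats and sizes are generic assumptions, not taken from any particular game:

Code:
def texture_mib(width, height, bytes_per_texel, full_mip_chain=True):
    size = width * height * bytes_per_texel
    if full_mip_chain:
        size *= 4 / 3          # a full mip chain adds roughly a third on top
    return size / (1024 ** 2)

print(texture_mib(4096, 4096, 4.0))  # uncompressed RGBA8 4K texture: ~85 MiB
print(texture_mib(4096, 4096, 1.0))  # same texture block-compressed (e.g. BC7): ~21 MiB
print(texture_mib(2048, 2048, 1.0))  # one texture tier lower: ~5 MiB

Dropping a texture tier roughly quarters the per-texture cost, which is why the texture slider is usually the single biggest VRAM lever, at least in games where it actually scales.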
 
That's not the case with modern games anymore. At least not with the games I know.

RDR2's texture settings simply do not scale; they only give you 100-200MB back, and the texture quality difference between High and Ultra is so enormous you could puke. You're accepting a HUGE texture quality compromise for a measly 200MB VRAM reduction. It's practically Rockstar's way of saying "you either have enough VRAM or gtfo".

I keep parroting on about RDR2, but it's the current-gen game that mattered most to me. It's simply the best game released on PC in the entirety of the generation. Any other games, I don't care much about. Shadow of the Tomb Raider is another offender, but it has a bland story, so yeah.

I experienced the same in Godfall with the 3070. "Lowering" textures or "dialing settings" did not help. So there goes your argument, I guess.

Once you're severely out of VRAM, there's no setting to dial down that will save you xd
 
I haven't watched the full videos since I'm at work, but they used an i3 4130 and a Pentium G3258 for the comparisons, and both of those CPUs run at 3GHz+ with a higher IPC, so they are more powerful than the PS4 CPU. The PS4 has 1.6GHz AMD Jaguar cores, a low-power design that is much weaker per clock than contemporary desktop chips. Not really a fair comparison when the CPU will be pushing the GPU to the max on the PC while the PS4 could be CPU limited. The latest PS4 games would be pushing the GPU to the max, though.
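Just to put hand-wavy numbers on the clock/IPC point (the relative IPC factor below is an assumption for illustration, not a measurement, and the PS4 does have eight of those Jaguar cores versus two here):

Code:
# Per-thread grunt ~ clock * relative IPC. The IPC factor is assumed, purely illustrative.
cpus = {
    "PS4 Jaguar core": (1.6, 1.0),   # 1.6 GHz, baseline IPC
    "Pentium G3258":   (3.2, 1.7),   # Haswell core, assumed ~1.7x Jaguar IPC
    "i3 4130":         (3.4, 1.7),
}
baseline = cpus["PS4 Jaguar core"][0] * cpus["PS4 Jaguar core"][1]
for name, (ghz, ipc) in cpus.items():
    print(f"{name}: ~{ghz * ipc / baseline:.1f}x a PS4 core per thread")
# The PS4 claws some of that back by having eight cores (not all available to games) vs two here.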
 
I'm aware, but the PS4 also outpaced them when you look at it.

That 2-core CPU probably can't cope with modern games anymore. I don't think it would hold 30 FPS with a smooth frametime in RDR2, or 60 FPS in COD MW and Fortnite (not that the PS4 does a great job, but it mostly stays above 50 with somewhat decent frametimes).

Even some 4-core CPUs (non-HT) are starting to really struggle with modern games.

You have a good point, though; then again, it was always weird that R* didn't push 60 FPS on GTA 5. I feel like they could have.

--

Btw, these videos actually started the "PS4 lol, a 750 Ti beats the PS4, lolol consoles are weak" meme that was rampant among PC "Master Race" members.

The same way some dumb PC users will now claim the PS5 can perform as low as a 1060 because Steve from Gamers Nexus found so in an outlier game.

TBH, every time Digital Foundry, Gamers Nexus or any other channel opens their mouth about these topics, they create unintentional console wars on the forums.

Since DF stated in a video that the Series X's RT performance is near an RTX 2060, I've seen a lot of comments like that.

I'm pretty sure the SX will hold respectable RT performance even 5 years later, but I realistically don't think the 2060 will have the same support. So there's no point downplaying the consoles by saying they're low-midrange GPUs.
 
Fair enough, I'll have a look at RDR2 some time and see if AA impacts it as well. Annoying that I didn't realise it was going to be removed from Game Pass; it's on my list of games I want to play (in general, it seems I should pay more attention to the comings and goings and play games while they're available, rather than treating it as a backlog extension).
 
Normally, whoever buys a 3080 won't keep it until the end of life of these consoles. They'll probably replace it with a 4080 or 5080.
Could this be a problem in some games? Yes, depending on resolution. The same goes for the 3070. But that's the risk when gaming at 4K.

BTW, RDR2 is not your average game, so it's a bit pointless to hold it up as some sort of example for all console games.
 
You've made assumptions and guesses in this post that are wrong.

The biggest crime of them all was assuming that all TFLOPS are equal.

Exactly. The 6800 XT is around 20.7 TFLOPS and the 3080 is 29.7 TFLOPS, yet we see the 6800 XT is faster in many games, especially those developed in partnership with AMD. I expect the consoles to perform similarly to a 3060 Ti/2080 and, over time, surpass the 3070/2080 Ti. The 2080 Ti can do 13.5 TFLOPS, which is not much more than the XSX.

Which is why I called them rough gauges of performance. I'm not saying they translate perfectly into performance numbers, because there is a huge number of other factors. But more importantly, these differences in the numbers are not at odds with what we see in reality when modern games run on these consoles. The new consoles are not good at high-demand games: the AAA games that support "4K" often just use variable resolutions and drop to 1440p or 1080p when they suffer frame loss, and to achieve 120Hz in the few games that support it they often sacrifice even the most basic scene detail.

If you look at in-depth technical breakdowns of the latest games on the consoles by a channel like DF, you'll see that games use variable resolutions and often drop down to quite low resolutions, and that settings are dialled right back to achieve things like the 120Hz mode. Ray tracing, to the extent that it exists, has extremely aggressive optimizations above and beyond the PC ones to cull out a lot of additional detail. Here are some great in-depth videos on this: https://www.youtube.com/watch?v=hLUrgHWxWCU and https://www.youtube.com/watch?v=CF9A935XFkU

These machines, while impressive leaps over their predecessors, are not the high-end gaming platforms you'd get from a high-end PC dGPU setup. Actual performance analysis of games shows performance much closer to that of a 2060. If you're a gamer, you've used an array of hardware and know what kind of visuals and frame rate to expect from it; look at what these consoles are capable of, and the claim that they're closer to a 2060 shouldn't be controversial at all.

The GN channel has comparisons of the PS5 to PC builds in a variety of games here: https://www.youtube.com/watch?v=HCvE4JGJujk
-DMC5 is a clear win for the PC on a GTX 1060
-The PS5 slightly beats a GTX 1080 in DIRT 5
-In Borderlands 3 at matched settings, the PS5 slightly loses to a GTX 1070 Ti

So yeah, it's roughly the same speed as, or a bit slower than, the flagship cards from two generations ago. Because those cards have zero RT support, it's a bit fairer to say the PS5 is more like a slower mid-range card from only one generation back that also has RT support, which is something like a 2060. A number of other places have made this kind of comparison. In reality, the non-RT performance of a 2060 is probably still quite a bit faster than the PS5.

The idea that the consoles are high-end gaming machines is... hype and advertising.
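For anyone curious what the variable/dynamic resolution trick looks like mechanically, here's a toy sketch of the usual feedback loop (all numbers made up, not from any particular engine): measure the GPU frame time and nudge the render scale up or down to hold the frame budget.

Code:
TARGET_MS = 16.7                      # 60 fps frame budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0

def update_render_scale(scale, gpu_frame_ms):
    # Over budget: shrink the render resolution. Comfortable headroom: creep back up.
    if gpu_frame_ms > TARGET_MS:
        scale -= 0.05
    elif gpu_frame_ms < TARGET_MS * 0.9:
        scale += 0.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for ms in [18.5, 19.0, 17.2, 15.1, 14.0]:            # made-up per-frame GPU times
    scale = update_render_scale(scale, ms)
    print(f"{ms:4.1f} ms -> scale {scale:.2f} ({int(3840 * scale)}x{int(2160 * scale)})")

The output then gets upscaled back to the display resolution, which is why a "4K" console mode can really be rendering closer to 1440p in heavy scenes.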
 
One page later and people are bringing this up again? lol.

A 1060 cannot run Borderlands anywhere near the 4K the PS5 does.
Devil May Cry 5 does not run at 1080p; it runs at a higher resolution even in the 120Hz mode on the PS5, making the comparison skewed for obvious reasons. Besides, it has ray tracing options and a 4K option, which, well, the 1060 can't run.
Dirt 5 has no RT options on the 1080. Besides, ironically enough, the game for some odd reason has been losing performance with every console update; it's a janky mess to be honest.
Funny how he left out games like COD, where it almost matches a 3070, lol.


I couldn't care less about console vs PC wars etc. (I own both a PS5 and a high-end gaming PC), but at least get your facts straight. I can't believe people genuinely think a 1060 or a 1070 matches either of the new consoles; certain YouTubers should be ashamed of the BS they spout, honestly, making people believe that a 1060 will match a console, lol.

Back to the topic: how have people's experiences been with Resident Evil 8 on a 3080? I heard it gets really close to using 10GB of VRAM; I can see plenty of folks trying to upgrade to a 3080 Ti very soon.
 