Clueless!
Keep on digging, you only embarrass yourself
A PlayStation 5 will be better at 8K than a 3080
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
More rubbish.
The fact you are willing to stoop so low to try and make your point says a lot. Love it, you keep proving me right. Using 8K, which no one here uses, to make a point. Hahahahahaha. Clueless!
All I'm doing is posting info relevant for people when making a purchasing decision. That you see something more than that is your problem. Back on the ignore list you go.

Told you both Grim5 and Poneros will be on a crusade scouring the web for any tiny bit of info to support their viewpoint that 10GB is not enough and spread the word. They proved me right within hours.
+1
Yeah yeah... Won't fool me with that line, crusader.
24 GB is sometimes better than 10 GB
While the GeForce RTX 3080 review found that 10 GB was always sufficient across the benchmark course compared with 8 GB, the GeForce RTX 3090 shows in individual cases that this does not hold for the percentile FPS.
The gain in Ghost Recon Breakpoint is surprisingly large: there the GeForce RTX 3090 is a clear 22 percent ahead of the GeForce RTX 3080. Evidently this game requires more than 10 GB of memory in Ultra HD with maximum details, but it is the only title in the test course that responds this way.
This is why I am laughing my head off; these two are so desperate to make the point that 10GB is not enough that they submit invalid examples. Grim is now trying to use 8K to make his point. Sad. Lol.

The frame rate averages for even the 3090 are sub-60 in that example - not what I would call playable for a shooter, so adding more VRAM would not save the 3080 there.
Desperation to make a point is why.

Why the F is anyone talking about 8k... that's the power of mis-marketing for you!
Can't argue with that.

Let's stick to the relevant discussion, which is VRAM usage while gaming at 4K. It would be wrong to say that 10GB is currently a limitation, but it would also be wrong to say that games definitely won't exceed it 1-2 years down the line, including any that develop a big modding scene and produce crazy texture packs. Lacking a crystal ball, we do not know for certain whether 10GB VRAM will become a limitation within this GPU generation's lifecycle, but I think it's fair to say we are at the point where those limits can be pushed by specific games. If additional VRAM currently allows it to be used as a cache to deliver textures to games more quickly then that is also a benefit, though whether you are willing to pay extra now for benefits you may only realise later down the line is another matter.
This is where our views differ. Assuming one will be selling to upgrade to Hopper, and the 20GB variant adding an extra £200 on average to the price, do you really think you will be able to sell a 20GB 3080 for £200 more, or even £150 more, around the time Hopper is coming out? My experience from the last two decades says highly unlikely.

My personal view is that when buying an already expensive electronics product, it's sometimes worth spending extra to maximise the longevity and add something to the re-sale value that offsets the additional investment. We have yet to see how much a 20GB 3080 will cost, but even if it's £900-£950 it will certainly be far better value than the 3090 while preparing you for the possibility that the VRAM may come in handy 12-24 months down the line.
Guilty as charged, I will sell my 3080 in 18-24 months' time as soon as I get wind of another Jensen presentation coming up.
It's my speculative and subjective view, but one that I base on my experience with electronics in general. People are often happy to pay more for the perception that something is more future-proof, whether that be additional storage, memory or whatever. I think there is a realistic chance that 20GB VRAM will come in useful during ownership of a 3080 and I would prefer to pay an extra 15%, on top of an already significant GPU investment, for that peace of mind. But hey, each to their own, and I will be happy to re-visit this thread when we have more information or examples of where VRAM does or does not come in useful in future.
Saying that though, for the past few years people have been happy to pay near full price for second-hand gear, so it is possible. It just depends whether Nvidia do a mid-gen refresh like they did with Turing and how much that forces prices down, and also on what AMD does. But generally speaking, the higher the price, the bigger the depreciation.
I think it will end up being more than 15%, that's the problem. It will probably be closer to 30%.
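As a rough sanity check on those two estimates - a minimal sketch assuming the £649 base price mentioned in this thread and purely hypothetical prices for a 20GB variant, since nothing has been announced:

```python
# Premium of a hypothetical 20GB 3080 over the 10GB card.
# Both variant prices below are assumptions, not announced figures.
BASE_PRICE = 649  # 10GB 3080 launch price in GBP (from this thread)

def premium_pct(variant_price: float, base: float = BASE_PRICE) -> float:
    """Percentage premium of a variant's price over the base card."""
    return (variant_price - base) / base * 100

print(round(premium_pct(749)))  # an extra £100 is ~15%
print(round(premium_pct(849)))  # an extra £200 is ~31%, i.e. "closer to 30%"
```

So the gap between the two views comes down to the assumed surcharge: £100 on a £649 card is about 15%, while the £200 figure mooted above is nearer 30%.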
I think it depends what type of GPU buyer you are. Lots of people on here are flippers and will buy and then sell on willy nilly in short periods of time.
Fair enough, but I can't be bothered with that, so I like to buy towards the top end of a good GPU generation and then keep it for 2-3 years.
As such, I'd like a 3080 and can afford one, but it being 'only' 10GB VRAM genuinely does irk me a bit. It's skating on the edge right now for some stuff at 4K, and in 1-2 years' time I guarantee there will be more issues.
And I keep coming back to how, from a marketing perspective, it's a bit of a silly move to release a card with less VRAM than the 2080Ti and even the 1080Ti I have now. Nvidia should maybe have designed the bus to allow for 12 or 16GB of normal GDDR6, which would have significantly less power draw too and would keep costs down even more for this card that's supposed to be in the 'sweet spot'. And then they would have solved this VRAM niggle too...
I totally get your point about future proofing, but bear in mind you keep comparing it to a Ti, and it isn't one.
I haven't yet seen anyone going to those lengths to say games are unplayable or won't start with 10GB, but if I did see anyone say that I would of course agree that it's silly. However, in principle I would also not want to buy a 10GB 3080 and not be playing on max settings in all my games due to a VRAM limitation. I would want the card to be able to use its full processing grunt all of the time without any limitations stopping it doing so, which is why I would personally pay the extra for the 20GB if I was going to keep a card for 24-36 months (which I often do).
There is near-zero doubt in my mind that there will be at least a few examples of new games in 12-24 months needing more than 10GB at 4K to run maximum textures. That is not something that is in question by me personally. I just take issue with people making it sound like 10GB will not be enough to even launch a game and will mean all future games are unplayable, which is a load of ********. You just turn the texture setting down one notch for the said few titles at 4K and upgrade to Hopper not so long later.
Yeah, each to their own. £649 is already a lot for a GPU imo, and as mentioned, the more you spend the more it is likely to depreciate, which I am not a fan of as I like to recycle the same money for my next GPU.
However, it really depends on what AMD come up with... if they release a 6900XT with similar performance to a 3080 for 25% less cost then it will be extremely tempting unless Nvidia can respond with a 20GB card that is not too much more expensive.