To be fair, though, 4K is a moving target. The Fury X was also marketed as a 4K card, no?
Problems are always inbound, whether it's VRAM, raster or RT performance.
Spot on.
Still waiting to hear what gpuerilla considers to be a "4K" card, too.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
My personal viewpoint has not changed. I was fine with the 10GB on the 3080: it played every game at the time just fine, and since then we have barely a handful of games that need more, and of those any given individual may want to play what, one? Big deal... Look at the alternative: had nvidia tried to put more VRAM on it at the time, the MSRP would have been higher.
The VRAM issue, imo, is only an issue for people who intend to keep the card for a very long time, and for those who are inflexible and find the thought of tweaking settings unimaginable. For them there are 16GB or 24GB cards, but even with those there will be situations where you need to tweak settings; with an AMD card, for example, you will need to tweak RT settings.
You might be on to something there. From what I remember, the game used less than 8GB of VRAM on my laptop, as if there were some kind of limit imposed, whereas on my desktop it used almost 10GB.
I still get stuttering on my desktop, just not as bad.
You can find the specs of my desktop and laptop in my sig.
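If anyone wants to log this themselves rather than eyeball an overlay, here's a minimal sketch that polls the card's VRAM use once a second through NVIDIA's NVML Python bindings (assumes an NVIDIA card with the nvidia-ml-py package installed; the GPU index is an assumption too, so adjust it on multi-GPU rigs):

```python
# Minimal VRAM logger using NVIDIA's NVML bindings (pip install nvidia-ml-py).
# Run it in a second window while the game is up; Ctrl+C to stop.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0; change for multi-GPU setups
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are in bytes
        print(f"{mem.used / 2**30:.2f} GiB used of {mem.total / 2**30:.2f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Worth remembering that this reports what's allocated on the card overall, not what the game strictly needs, so "it used almost 10GB" doesn't automatically mean it needed 10GB.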
Well, I suppose one way round it is to only ever buy GPUs for the short term, at over half a grand a pop.
My old 1070 would do Windows 10 at 4K. Therefore I consider it a 4K card!
Turnip gets it.
Others try to be clever instead.
I have never tried to be clever. This is true.
Yeah, and it's mostly an issue for those trying to push the cards at 4K or for VR.
Games are going to get more demanding, especially with the new engines, so people playing at higher resolutions will need to scale back settings to find playable FPS, which, funnily enough, will reduce VRAM demand.
Going AMD RDNA2 is not an option for me; I like to run RT games at a playable FPS, and DLSS can't be beaten either. So I'm more than content with 10GB.
He definitely does. Others try to be clever instead. Jensen said it was a 4K card. You can lead a horse to water...
Anyone who spent £650 on a 10GB 3080 is laughing; imagine the poor guys who will soon be paying £300 for a 4GB 6500 XT.
No-one did, but it seems that now, due to the 'new market situation', we may have to make our current cards last well into the next generation. DLSS will be a huge contributor to making that happen, but it's possible that 8-10GB cards may suffer. I didn't buy my card expecting it to last forever.
I set my texture pool in FF7 Remake to 1 gig on the debug console and was greeted with Minecraft textures.
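For anyone wanting to try the same experiment: FF7 Remake runs on Unreal Engine 4, so the pool being resized is presumably UE4's standard texture streaming pool. A sketch of pinning it via config instead of the console, assuming the game reads the usual UE4 cvars (the value is in MB, so 1024 is the 1 gig above; the exact Engine.ini location under the game's Saved\Config folder is a guess on my part):

```ini
; Engine.ini -- hypothetical tweak using the standard UE4 cvar.
; Typed into a debug console the equivalent is: r.Streaming.PoolSize 1024
[SystemSettings]
r.Streaming.PoolSize=1024
```

The console route applies instantly; the ini applies it at startup.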
That's the thing: even at 4K, 10GB is still not an issue (and I have a 4K display). There are a couple of games where you can max out 10GB even at 1440p (it's not "2K"), and here we are again, back to "it uses all my VRAM, therefore you NEED more VRAM"... Have a look at games like Resident Evil Village, Horizon Zero Dawn and Godfall to see how performance compares between a 3080 10GB and a 6800 XT, even though one card uses more VRAM. Also, still waiting for proof that the extra 2GB of VRAM is benefitting the 3080 12GB over the 3080 10GB model (outside of its better overall specs)?
Think you may need some help, buddy. Just read what you posted: you have just cemented that the 3080 is a 4K card, as in it was released to be able to play games at 4K, which is what you do. Why is it that you latch onto specific words in these strange posts? If you only put in the effort on this instead of on your month-counting, you would be more credible! "It's been released for 1yr 9 months."
You still did not answer my question, which was: why did nvidia release the same card but decide to throw an extra bit of VRAM on it? Surely if it's pointless, a waste of money, "doesn't need it"... then why not just stay at 10GB?
As for why nvidia have released a 12GB model... perhaps because they are a company that wants to make as much money as possible? Shocker, I know. Also, nvidia like to saturate the market with a card for every performance and price sector; this is pretty common business practice, not to mention it also means they dominate the benchmark scoreboards. Just look at the incoming 3060 and 3070 variants.
Has there been any proof to show that the extra 2GB of VRAM is actually benefitting the 3080 12GB (outside of its overall specs being better than the 10GB version's)?
Before launch and at the reveals, both AMD and nvidia demoed games being played at 4K, with FPS charts.
Review and release FPS charts are for apples-to-apples comparisons, so that the percentage performance increase over earlier cards can be evaluated.
Anyone who actually games at 4K knows that you don't turn some settings to max, or on at all, as they are there to enhance potato resolutions. So you turn those off, freeing up GPU resources. Anyone using review FPS to claim whether a card is a 4K card or not doesn't know what they are doing.

They're probably the same people who use graphics presets, whereas an experienced PC gamer will spend time going through all the settings a game offers to get the best FPS/IQ trade-off, especially at higher resolutions. Presets are just four or five shortcuts for folk who don't understand all the graphical settings and what they do. The added bonus of PC gaming is the plethora of settings available for fettling, to give you the best gaming experience for your monitor's refresh rate and IQ. You just cut your cloth according to your setup.
It amazes me how many people go on about 4K but have never gamed at it.