> The 3GB 1060 was never a good purchase; it was a slower GPU than the 6GB 1060.

Oh okay.
> So I am going to jump from an Intel HD 5500 to an RTX 3080. It's going to be amazing, but do y'all think that if I'm keeping the card for 4-5 years, 10GB will be good enough for high settings, no AA, at 1440p? I have no experience; I have never owned a video card before. I will only buy the 20GB card if it's only $100 more; any more than that and it's not worth my money. Hope AMD can compete, I'm interested in seeing what RDNA2 is going to do this time.

I think given the choices you will be fine with the 3080 10GB. If you go with AMD, they will have more VRAM, but you'll give up other software benefits (like DLSS). If you go with the 3070 16GB (when that releases), then you'll trade less performance for more VRAM. How it all shakes out is hard to tell, but it will come down to subjective preference: if you're OK with turning texture quality down a step, then the 3080 10GB is guaranteed to be the best choice. Otherwise you'll have to weigh texture quality against more FPS. Tbh you can't go too wrong with any of the choices atm, though at the same time you can't even buy the cards, so it's all a moot point. At 1440p, though, I would say you shouldn't worry about VRAM.
I am just worried 10GB might turn out to be like the 1060 3GB, which aged like milk...
Thanks a lot for the answer, I really appreciate it!
Watch this comparison as an example of what you might have to turn down (more relevant for 4K):
> At 2K the 3080 will be fine for ages; I've been running things at 2K on a 1080 with 8GB of VRAM and haven't really hit any issues yet. At 4K you might have questions, but at 2K, nah, you will be fine, and tbh unless you want to run things at silly levels at 4K it's probably going to be fine.

Aight.
No problem. AMD-wise, we don't really know anything about something "DLSS-like", but we'll definitely get hardware-accelerated ray tracing (though who knows if it will be faster or slower than Nvidia's implementation). DirectStorage is vendor-agnostic and will also be present on AMD, but it will require that developers implement it in their games in the first place. For the first few years I'd say it's not going to be relevant much at all, until people start catching up and fully switching to the new consoles.
Oh okay, I think I would like to get more FPS, and I would be fine going Ultra -> High if I ran into limitations 2 years or so down the line.
Wasn't AMD planning on launching their own open-source DLSS, DirectStorage, and ray tracing?
Yea, I'm hoping they come up with something DLSS-like, otherwise it won't make sense to buy AMD.
My personal view is that when buying an already expensive electronics product, it's sometimes worth spending extra to maximise the longevity and add something to the resale value that offsets the additional investment. We have yet to see how much a 20GB 3080 will cost, but even if it's £900-£950 it will certainly be far better value than the 3090, while preparing you for the possibility that the extra VRAM may come in handy 12-24 months down the line.
I'd agree with this IF it actually future-proofed the card more, but that's not typically the case. Future games have larger demands on VRAM for sure, but they also have greater demands on the GPU. What you'll find is that in two years, if you're playing games and trying to max out settings at 4K, your GPU will be suffering trying to run the game. And when that happens people go straight to the video settings, they lower all the new effects which are dragging the frame rate down, and as you lower those settings you lower the demand on VRAM.
The usage of VRAM and the load on the GPU are disconnected in that way: you cannot just load more stuff into VRAM in future games without that stuff also having an impact on the GPU and the frame rate it can put out.
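To put rough numbers on why higher-resolution textures drive VRAM usage so hard, here is a back-of-envelope sketch. It assumes uncompressed RGBA8 textures (4 bytes per pixel) and a full mipmap chain (which adds about a third on top of the base level); real games use block compression, which shrinks these figures by roughly 4-8x, but the scaling is the same:

```python
# Back-of-envelope VRAM cost of a single square texture.
# Assumption: uncompressed RGBA8 (4 bytes per pixel); a full
# mipmap chain adds roughly 1/3 on top of the base level.

def texture_mib(side_px: int, bytes_per_px: int = 4, mipmaps: bool = True) -> float:
    base = side_px * side_px * bytes_per_px
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 ** 2)  # bytes -> MiB

for side in (2048, 4096, 8192):
    print(f"{side}x{side}: {texture_mib(side):.1f} MiB")
# Each doubling of texture resolution quadruples the memory cost,
# which is why an "8K textures" setting can blow past a VRAM budget
# that comfortably fits 4K textures.
```

A 4096x4096 texture under these assumptions is about 85 MiB, and the 8192x8192 version is exactly four times that, around 341 MiB, which is the core of the trade-off being argued about here.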
I wish you would stop talking in definitives when what you are saying is actually speculative. You are making a reasoned assumption, an educated guess. You do not know beyond doubt that a 10GB 3080 will not run into a VRAM limitation that holds back the available GPU horsepower in future, especially when we have seen similar happen in past generations some years back. The journey from 3-4GB of VRAM being an actual limitation to 10GB of VRAM being a potential limitation has been a hell of a long one, but we are now at the point where it could happen within this generation.
Do you mean connected? We have seen in the past, with older generations, that video cards can be VRAM-limited but not horsepower-limited. It's not a given that a card which doesn't have enough VRAM to smoothly handle a game also lacks the horsepower to run that game.
Anyway, 10GB is currently not a limitation; that's clear. However, even without knowing whether VRAM will actually become a limitation within this generation, I would still prefer to invest some extra (20% or so would seem reasonable to me) in a 20GB 3080 (or a 16GB 3090XT, if they cut the mustard, as that should be cheaper than a 3080) to remove this as a potential concern on an expensive GPU purchase, for all the reasons I stated in my posts above. I'm more than happy to agree to disagree on the rest, and I guess we now just have to revisit this thread later when we know more.
> I'm not against people wanting to not take a risk and pay for, say, an extra 6-10GB of VRAM; I mean, that's gonna cost you, because it's not cheap. And ultimately that might turn out to be a pointless upgrade, but that's your risk to take; it could go the other way and I could be wrong. Mostly what bugs me is that people discuss this topic one-sidedly, as if it has already been decided that 10GB is not enough, when so far the evidence kinda points the other way. I fundamentally see the argument that future games will demand more VRAM as a one-sided take on the situation: they will also demand more from the CPU/RAM/GPU/HDD/SSD etc, so it's not that clear cut.

Pretty much any modern AMD or Intel CPU with 16GB of RAM and an SSD will do the job of powering a modern GPU, which most people already have in some form. Indeed, I would say the CPU hasn't really been a limiting factor for a good while now. The GPU is the limiting factor in the vast majority of games at higher resolutions nowadays.
Actually, you could; you were just running at a resolution where you would not see it. I remember it made a difference on a 4K screen. I don't think it was ever meant for lower resolutions, though.

Don't be silly, 20GB won't be enough. Just pay a little bit more again and get the 3090.
I think it was Shadow of War that offered an optional high res texture pack, which I downloaded and tried at the time of release. I don't know what I expected, but you really couldn't tell the difference while playing. Complete waste of download time and storage space.
> I think because 8K is now a thing, like how 4K was a thing when it was first all hyped up, it means ultra textures are going to be 8K instead of 4K, so that's where the 10GB will not be enough. But for just 4K it will be.

You can already get 8K textures for 4K games that increase VRAM requirements, usually as user-created mods. 8K gaming isn't yet "a thing"; Nvidia just shoehorned it into their marketing. 8K is four times the number of pixels of 4K, and current-gen GPUs can only just now run 4K reliably. 8K gaming is at least a couple of generations away yet.
> It is a thing... you can run a game at 30fps at 8K. I remember when 4K was "a thing", and it was the same back then. That was my point about 8K textures and ultra texture settings in future games. We're talking about the actual game here, not mods.

It's possible, but it's not a thing. The minimum frame rate is also under 30fps. But sure, if you want to call it a thing then go ahead and call it a thing. *shrugs*
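For reference, the pixel-count arithmetic behind the 4K-vs-8K comparison in this exchange, as a quick sketch (note "2K" in this thread means 2560x1440, as the posters use it, not the DCI 2048x1080 definition):

```python
# Pixel counts for the resolutions discussed in this thread,
# to put "8K is four times the pixels of 4K" into numbers.

resolutions = {
    "1440p (2560x1440)": 2560 * 1440,
    "4K (3840x2160)": 3840 * 2160,
    "8K (7680x4320)": 7680 * 4320,
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

# Each step from 4K to 8K doubles both dimensions, so the
# pixel count (and the raw shading work per frame) quadruples.
ratio = resolutions["8K (7680x4320)"] / resolutions["4K (3840x2160)"]
print(f"8K/4K pixel ratio: {ratio:.0f}x")
```

This is why "current-gen GPUs can only just now run 4K reliably" translates into 8K being generations away: the frame buffer itself is small, but the per-pixel work quadruples.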