10GB is fine though.....
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
10GB is fine though.....
Seems so for the moment anyway.
1783 post thread TLDR:
Is 10GB GDDR6X enough for 4K ultra settings now? Yes.
How long until 10GB GDDR6X is not enough for 4K Ultra settings? Nobody knows. Some people say soon, others say it’s a while away.
What happens when 10GB GDDR6X isn’t enough for 4K Ultra settings? You will need to turn some settings down in that game.
More rubbish from you as usual.
If you buy the flagship just "for the moment" - then sure, 10GB is fine for 4K. Fast forward into next year, when next gen games release, I wouldn't be so sure. Consoles are getting a double memory upgrade (8GB total memory on current consoles, to 16GB on next gen. Note this is shared memory, though they are still getting double!) so it's only logical that VRAM requirements will drastically increase, if not double, for the next gen games that are incoming.
It's also a matter of principle - who wants to spend £800 on a GPU that's barely enough for 4K in current games? Shouldn't the flagship have enough for 2-4 years, as has been the norm for previous generations? It's got less VRAM than the ancient 1080ti!
The 3080ti will be a fantastic card; I hope Nvidia use their industry power/finances to rush it through asap.
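To put rough numbers on the console argument above: shared memory is split between the OS, game logic and graphics, so 16GB shared does not mean 16GB of VRAM. Here's a back-of-envelope sketch; the reservations are guesses for illustration, not vendor-confirmed figures.

```python
# Rough budget sketch: "16GB shared" on a next-gen console is not 16GB of VRAM.
# All figures below are illustrative assumptions, not vendor-confirmed numbers.

TOTAL_SHARED_GB = 16.0   # next-gen console unified memory
OS_RESERVED_GB  = 2.5    # assumed OS/system reservation
GAME_LOGIC_GB   = 3.5    # assumed CPU-side game data (simulation, audio, AI, etc.)

graphics_budget = TOTAL_SHARED_GB - OS_RESERVED_GB - GAME_LOGIC_GB
print(f"Plausible graphics budget: {graphics_budget:.1f} GB")  # -> 10.0 GB
```

On those guesses the graphics slice lands somewhere around 10GB, which is exactly why both sides of this thread can point at the consoles for support.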
The answer is we don't know.
Future games designed for a 4K baseline will either require larger amounts of VRAM or be heavily dependent on the ability to stream assets in very quickly.
Time will tell which approach will be better in the short/long term, but do not expect game developers to be "efficient" with their use of VRAM: for the most part, when it comes to the PC version, everything just gets dialled up with no optimisation.
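For the streaming side of that trade-off, the core mechanism is essentially a residency cache: keep what was used recently in VRAM, evict the rest, and pull data back from the SSD on demand. A minimal sketch of the idea (the class and texture names are hypothetical, not any engine's real API):

```python
from collections import OrderedDict

class TextureCache:
    """Minimal LRU residency sketch: keep recently used textures in VRAM,
    evict the least recently used ones when over budget. Illustrative only."""

    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture_id -> size_mb
        self.used_mb = 0

    def request(self, texture_id: str, size_mb: int) -> str:
        if texture_id in self.resident:
            self.resident.move_to_end(texture_id)   # mark as most recently used
            return "hit (already in VRAM)"
        # Evict least recently used textures until the new one fits.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted_mb = self.resident.popitem(last=False)
            self.used_mb -= evicted_mb
        self.resident[texture_id] = size_mb          # "stream in" from SSD
        self.used_mb += size_mb
        return "miss (streamed from disk)"

cache = TextureCache(budget_mb=8192)  # pretend 8GB of the card is for textures
print(cache.request("rock_albedo_4k", 64))
```

The smaller the VRAM budget, the more misses you take, and the faster your storage and decompression path has to be to hide them.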
I view VRAM as a kind of "insurance". More VRAM is worth more money to me, but probably not worth as much as Nvidia want to charge.
If they want $1k for 20gb of VRAM, I'll take my chances with 10 or just go AMD and get 16.
I realize that I am risking utter catastrophe. The day may come when, in some random new game, I must turn down a setting, and that just ruins my entire experience. And at that moment I will wish that I had spent an extra $300 so that I could use that setting in that game.
I'm scared just thinking about it. Someone hold me.
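For what it's worth, the "insurance" framing can be made literal with a toy expected-value calculation. Every number below is a personal guess plugged in for illustration, not data:

```python
# Toy expected-value framing of the "VRAM as insurance" argument.
# All three inputs are personal guesses, not measurements.
premium = 300.0        # extra cost of a hypothetical 20GB card ($)
p_need_more = 0.25     # guessed chance 10GB actually bites during ownership
cost_if_bitten = 500.0 # guessed "pain" of lowering settings / upgrading early ($)

expected_loss = p_need_more * cost_if_bitten
print(f"Expected cost of sticking with 10GB: ${expected_loss:.0f} "
      f"vs ${premium:.0f} premium for more VRAM")
```

If your guessed probability and pain are low, the premium isn't worth it; crank either up and the insurance starts to pay.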
3080TI 20GB pretty much confirmed at this point! There was a tweet linked on Reddit that's since been removed. Hopefully on TSMC 7nm.
Going to be fun quoting a few select people here who were adamant that the 10GB 3080 would be enough for 4K for years, and who claimed Nvidia wouldn't release a 20GB 3080/3080ti variant.... Wake up, sheep!
This is a newer leak which came out today - sounds exciting. More RAM + more cores.
What? No, the 3080 20GB and 3070 16GB were cancelled a long time ago. The 3080ti 12GB G6X and 3070ti 10GB G6X are the newly leaked refreshes.
If you buy the flagship just "for the moment" - then sure, 10GB is fine for 4K. Fast forward into next year, when next gen games release, I wouldn't be so sure. Consoles are getting a double memory upgrade (8GB total memory on current consoles, to 16GB on next gen. Note this is shared memory, though they are still getting double!) so it's only logical that VRAM requirements will drastically increase, if not double, for the next gen games that are incoming.
It's also a matter of principle - who wants to spend £800 on a GPU that's barely enough for 4K in current games? Shouldn't the flagship have enough for 2-4 years, as has been the norm for previous generations? It's got less VRAM than the ancient 1080ti!
The 3080ti will be a fantastic card; I hope Nvidia use their industry power/finances to rush it through asap.
SAM basically gives the CPU direct access to the card's entire VRAM (rather than the usual 256MB window), but only when configured in a system with one of the new Ryzen 5000 series CPUs. [11]
CAS is worthwhile, though it comes at the cost of increased aliasing and a drop in graphical clarity in many scenes.
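For context, the core idea of CAS (Contrast Adaptive Sharpening) is a sharpening filter whose strength backs off where local contrast is already high, to avoid ringing. A loose grayscale sketch of that idea in Python/NumPy - not AMD's actual shader:

```python
import numpy as np

def cas_like_sharpen(img: np.ndarray, sharpness: float = 0.5) -> np.ndarray:
    """Very loose sketch of contrast-adaptive sharpening on a grayscale
    image in [0, 1]. Not AMD's real implementation, just the core idea:
    sharpen less where local contrast is already high."""
    padded = np.pad(img, 1, mode="edge")
    # 4-neighbour min/max as a cheap local-contrast probe.
    n = padded[:-2, 1:-1]; s = padded[2:, 1:-1]
    w = padded[1:-1, :-2]; e = padded[1:-1, 2:]
    local_min = np.minimum.reduce([img, n, s, w, e])
    local_max = np.maximum.reduce([img, n, s, w, e])
    contrast = local_max - local_min
    amount = sharpness * (1.0 - contrast)   # high contrast -> less sharpening
    blurred = (n + s + w + e) / 4.0
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

frame = np.random.rand(8, 8)                # stand-in for a rendered frame
sharpened = cas_like_sharpen(frame)
```

Sharpening recovers apparent crispness after upscaling, but it can't invent detail, which is the complaint about aliasing and clarity above.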
Imagine you have the license plate of a car which is far away, the numbers and letters barely readable. Now if you upscale it, it will blur and still be unreadable - worse, it's blurred. DLSS has seen thousands of license plates in the high-res footage that it ingested during the training phase and knows what a license plate should look like. It then adds in the details that it learned. So if the digit 5 is indistinct, DLSS has seen that 5 before and adds in missing pixels to make the 5 sharp and full. So now we have a full-res image with more detail than the rendered image. [14]
DLSS 2.0 still delivers the best picture quality - in most cases with even more detail than native resolution. And the combination with a temporal filter eliminates flicker. FidelityFX CAS remains an alternative for those users who cannot activate DLSS for lack of an NVIDIA graphics card, but still want to play at high resolution. [15]
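That "temporal filter" part is easier to see in code than in prose: each new frame is blended into a history buffer that has been reprojected along per-pixel motion vectors, so detail accumulates across jittered frames. A minimal sketch of just the accumulation step - the neural network that decides per-pixel how much history to trust is omitted, and a fixed alpha stands in for it:

```python
import numpy as np

def temporal_accumulate(history: np.ndarray,
                        current: np.ndarray,
                        motion: np.ndarray,
                        alpha: float = 0.1) -> np.ndarray:
    """Sketch of the temporal half of a DLSS-style upscaler: fetch each
    pixel of the accumulated history from where it was last frame (via
    motion vectors), then blend a little of the new frame in."""
    h, w = current.shape
    ys, xs = np.indices((h, w))
    src_y = np.clip(ys - motion[..., 1].astype(int), 0, h - 1)
    src_x = np.clip(xs - motion[..., 0].astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]
    return (1.0 - alpha) * reprojected + alpha * current

# Toy usage: a static scene (zero motion vectors).
h = w = 4
history = np.zeros((h, w))
current = np.ones((h, w))
motion = np.zeros((h, w, 2))
out = temporal_accumulate(history, current, motion)
```

Accumulating many slightly-jittered frames is where the "more detail than native" effect comes from: each frame samples different sub-pixel positions, and the history stitches them together.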
[3] https://www.overclock3d.net/reviews/gpu_displays/asus_rtx_3080_oc_strix_review/20

The Radeon RX 6800 XT is also 29 percent slower than the GeForce RTX 3080 in that regard. "Measured by AMD engineering labs 8/17/2020 on an AMD RDNA 2 based graphics card, using the Procedural Geometry sample application from Microsoft's DXR SDK, the AMD RDNA 2 based graphics card gets up to 13.8x speedup (471 FPS) using HW based raytracing vs using the Software DXR fallback layer (34 FPS) at the same clocks. Performance may vary." NVIDIA GeForce RTX 3080: 630 FPS.

https://www.cnet.com/news/amd-radeon-rx-6800-xt-and-6900-gpus-target-4k-gaming-start-at-579/
https://www.amd.com/en/technologies/smart-access-memory

AMD claims that SAM combined with Rage Mode boosts frame rates by up to 13%, which means performance on Intel systems or upgraded systems won't be as good.

[12] https://www.techspot.com/article/1992-nvidia-dlss-2020/
https://www.game-debate.com/news/29159/death-stranding-pc-full-dlss-2-0-performance-analysis-and-graphics-comparison
https://www.youtube.com/watch?v=uMbIov1XQa8
https://www.nme.com/features/nvidias-dlss-2-0-tech-could-make-your-300-graphics-card-perform-like-a-700-one-2708925

Up to 11% extra performance.

https://hexus.net/tech/news/graphic...deon-rx-6000-series-promises-rtx-performance/
https://hexus.net/media/uploaded/2020/10/6c66ed53-e318-4ec0-b077-fbcc78f14cee.PNG

In an apples-to-oranges comparison we don't like, AMD further says Radeon RX 6800 XT is faster than GeForce RTX 3080 with Rage Mode and SAM turned on. Well, Nvidia cards can be overclocked too, so this comparison is not entirely fair.

[14] https://www.roadtovr.com/nvidias-latest-dlss-2-1-supersampling-now-supports-vr/

If you happen to have a Ryzen 5000 Series CPU paired alongside any Radeon RX 6000 Series GPU, a feature known as Smart Access Memory (SAM) enables the CPU to access the full complement of GPU memory, instead of just 256MB. It's said to offer an extra fps or two in complex scenes where the CPU historically bogs the GPU down.
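If you want to check whether a SAM-style resizable BAR is actually active on a Linux box, the PCI BAR sizes are visible in sysfs: with it enabled, one region spans the whole VRAM instead of the usual 256MB window. A small sketch - the PCI address is an assumption, look yours up with lspci:

```python
from pathlib import Path

def bar_sizes(pci_addr: str = "0000:01:00.0"):
    """Sketch: read a GPU's PCI BAR sizes from Linux sysfs. Each line of
    the 'resource' file is 'start end flags' in hex; with Resizable BAR /
    SAM active, one region covers the whole VRAM rather than 256 MiB.
    The device address above is an assumption, not universal."""
    lines = Path(f"/sys/bus/pci/devices/{pci_addr}/resource").read_text().splitlines()
    for i, line in enumerate(lines):
        start, end, _flags = (int(x, 16) for x in line.split())
        if end > start:                       # skip empty resources
            print(f"resource {i}: {(end - start + 1) / 2**20:.0f} MiB")

# bar_sizes()  # uncomment on a machine with a GPU at this address
```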
If you buy the flagship just "for the moment" - then sure, 10GB is fine for 4K. Fast forward into next year, when next gen games release, I wouldn't be so sure. Consoles are getting a double memory upgrade (8GB total memory on current consoles, to 16GB on next gen. Note this is shared memory, though they are still getting double!) so it's only logical that VRAM requirements will drastically increase, if not double, for the next gen games that are incoming.
It's also a matter of principle - who wants to spend £800 on a GPU that's barely enough for 4K in current games? Shouldn't the flagship have enough for 2-4 years, as has been the norm for previous generations? It's got less VRAM than the ancient 1080ti!
The 3080ti will be a fantastic card; I hope Nvidia use their industry power/finances to rush it through asap.
This thread has really gone round in circles; at the end of the day it's a judgement call.
I always think there's an art to buying a PC component - there isn't one answer for everyone.
1783 post thread TLDR:
Is 10GB GDDR6X enough for 4K ultra settings now? Yes.
How long until 10GB GDDR6X is not enough for 4K Ultra settings? Nobody knows. Some people say soon, others say it’s a while away.
What happens when 10GB GDDR6X isn’t enough for 4K Ultra settings? You will need to turn some settings down in that game.
TLDR of #1784 - Baaah. People on youtube tell me 3080 is top dog for many years. I believe them, and I'm prepared to turn down settings on next gen games so my brand new £650-870 flagship GPU can cope. Baaah