You literally said your GTX1080TI couldn't use 11GB of VRAM.
If a GTX1080 at under 100% utilisation can use more than 8GB of VRAM at qHD, why are you so adamant your GTX1080TI has too much VRAM??
I honestly don't think my 1080Ti can actually *use* all 11GB of VRAM.
Do you think every GPU is capable of using an unlimited amount of vram?
If not, how do you calculate how much vram a given GPU is capable of using?
How?
Nobody has said that an 8800 GT should have come with 20 GB VRAM. Or a 3DFX Voodoo2.
I believe we were talking about the 3070, 3080, 1080Ti, 2080Ti, etc. The latter two actually coming with 11 GB VRAM, which you said they "couldn't possibly" use.
These cards might be $1400, but have you forgotten the OCUK price on top? That $1400 will be more like £2000 for the 3090. Not a chance. It just works lol.
OK. If preorders are live tomorrow, who is actually buying a 3090 FE and what's the most you'll pay?
I will be at a maximum of £1399.
Will it match 2080ti performance though? It's rumoured to have 25% fewer shaders and only 230W. I wouldn't be surprised if it's closer to a 2080 Super in performance than a 2080ti, granted the RT performance should be better though.

The 3070 is supposed to be 2080 Ti performance yet has 3GB less VRAM.
You can't calculate an answer to a question that doesn't make sense.

Do you think every GPU is capable of using an unlimited amount of vram?
If not, how do you calculate how much vram a given GPU is capable of using?
How?
No chance.. Will be more like £1600 to £1800.. at least.
Okay. At least now we are getting to my point.
How do we calculate how much vram a given GPU can actually make use of?
Anyone care to hazard even a guess? No one has answered what I thought was a rather simple question.
Not a bad read, but it seems that 4-8GB is OK. I thought we would have pushed beyond this a while back.
Stop trolling. You ended up buying the Turing GPU with the second-largest amount of VRAM, and yet, if you truly believed what you said, you could have sold your GTX1080TI and got an RTX2070 Super, which is now slightly quicker.
You are now deflecting from what you said. Either read what I said, or I am going to ignore you.
The point about the 2080Ti stands. The 3070 is supposed to be 2080 Ti performance yet has 3GB less VRAM.
Sure it's cheaper, but your argument is that VRAM is paired scientifically to GPU grunt and nVidia don't give more than you need.
Doesn't mean it actually needs it. Can't say it chokes on only 8GB on an RTX 2080. Fine-tuning the settings for better performance also drops vRAM usage. I think the Manhattan area is the worst, but at this point I wouldn't look at MSFS 2020 as a landmark in optimization.
My point is, if my 1080Ti can actually use 11GB of vram (I have never seen mine use even 8GB, and it is probably going to be retired before ever actually making use of even 8GB of its vram)... if that is so, why then not demand 24GB of vram?
How do you calculate how much vram a given GPU can make use of?
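For anyone who wants to actually watch what their card reports while playing rather than guess, the allocated figure is easy to log. Here's a minimal sketch, assuming the pynvml package is installed (nvidia-smi with --query-gpu=memory.used --format=csv -l 1 shows the same numbers), that just polls NVML once a second. Bear in mind this is allocated VRAM, not what the game strictly needs:

```python
# Minimal VRAM logger using NVML via the pynvml package (pip install pynvml).
# "used" is allocated VRAM -- the same figure Afterburner/nvidia-smi report --
# which is not the same thing as the amount a game actually needs.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        print(f"GPU0 VRAM: {mem.used / 2**30:.2f} GiB used / {mem.total / 2**30:.2f} GiB total")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```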
It's like demanding that every sports car ever sold come with massive carbon-ceramic brakes because they *need* them... And then running said sports car in a 24hr endurance race with regular brakes to demonstrate that sports cars need to be able to run under conditions that almost no one who buys the car will ever encounter.
You can't calculate an answer to a question that doesn't make sense.
Well, still on my i5 3570k and recently bought another 16GB DDR3.

I bought 32gb of RAM when I built my old 3770k rig. I just retired that rig last year and never got anywhere near using even half of the RAM. (Although I'm sure people could have quoted examples where some computer...somewhere used 32gb of RAM.)
I don't know how much vram a given graphics card can actually make use of. I don't even know how to calculate such a metric. I do know that I would not spend extra money for, say, 32gb of vram on any GPU today.
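The closest thing to a back-of-envelope I can come up with is memory traffic per frame: bandwidth divided by frame rate puts a rough ceiling on how many bytes the GPU can even touch each frame. This is purely a sketch using the published bandwidth figures, and it ignores that VRAM also caches assets that aren't read every frame, so it's not a hard cap on useful capacity:

```python
# Rough ceiling on bytes a GPU can touch per frame = memory bandwidth / fps.
# Uses published bandwidth figures; ignores VRAM acting as a cache for assets
# that aren't read every frame, so this is not a hard cap on useful capacity.
GB = 1e9

cards = {
    "GTX 1080   (320 GB/s)": 320 * GB,
    "GTX 1080Ti (484 GB/s)": 484 * GB,
    "RTX 2080Ti (616 GB/s)": 616 * GB,
}

for name, bandwidth in cards.items():
    for fps in (60, 144):
        print(f"{name} @ {fps:>3} fps: ~{bandwidth / fps / GB:.1f} GB touched per frame")
```

At 60 fps that works out to roughly 8GB for a 1080Ti, which is at least in the same ballpark as what people report seeing, but I wouldn't call it a proper answer to the question.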