Yeah you did, you spent all day making out black was white, or green in this case. Me grow up? Maybe it's time you grew out of being lovesick for a corp that sells you GPUs.
I never said 8GB was enough, but apparently I believe it is?
Sorry, where did I say it was OK for Nvidia to lie? I didn't, but apparently I think it is?
james.miller said:
NVIDIA GOT WHAT THEY DESERVED. They misadvertised the number of ROPs and the size of the L2 cache. There's no getting away from that.
But apparently I'm defending them? But apparently I don't think Nvidia lied?
Yeah but it should be fine, I hope.
Shame about the G-Sync-only monitor.
3080 has better grunt even for 1440p; for high refresh I don't see why you'd get a 3070 when a 3080 isn't a huge amount more.
This. 1440p and 1080p use midrange cards.
Yeah, these are £200 cheaper than a 3080, and maybe the chance to get one on Thursday rather than waiting till next year for a 3080.
Whilst I'll agree the 3080 has more grunt, not everyone can afford the difference in price, so for those who can't the 3070 will do them just fine. Oh, and I'm not an Nvidia fanboy, I've had AMD cards as well as Nvidia.
It's £500+ for 8GB of VRAM; that is not acceptable in 2020 when the PS5 has 16GB of the same memory for £450. Come 2021 and console ports to PC, that 8GB is gonna choke like ******. My suggestion is to go RDNA2 if you want to survive the next few years with the latest games.
Lol, thing is I go with what best suits me at the time. Don't see the point in being loyal to either, because they ain't loyal to me; they don't say "you've had an Nvidia card for so long, you can have a discount", so I go with what I feel suits my needs at the time. And I won't be buying at these inflated prices, I shall wait till they come down.
You are a brave man for such honesty.. one does not simply mention AMD cards in an Nvidia thread.
Really, is that what people are saying?
Most of us don't buy 4K monitors either.
Nope, but what they are saying is "it's enough". Totally forgetting that most of us don't buy a card every time Nvidia farts.
It's not enough. Come back in a year and tell me it was enough.
Nope, but what they are saying is "it's enough". Totally forgetting that most of us don't buy a card every time Nvidia farts.
It's not enough. Come back in a year and tell me it was enough.
It would be easier if we treat the games we have access to now and the games that might arrive in the future as two separate things. I.e., is 8GB of VRAM enough right now, and will it be enough in the future?
Now, Dave2150 took issue with Doom Eternal, suggesting that people are dismissing it and apparently saying "nobody plays it". Nobody to my knowledge has said this. Nobody could logically dismiss the game. What they (myself included) are doing is sorting the facts from the bull. We aren't dismissing the game; we're saying that the whole texture pool setting is a load of overblown nonsense, and it is. And the 3070 benchmarks PROVE that the game runs just fine maxed out with an 8GB buffer, which should put an end to people posting charts of the VRAM allocation on a 2080 Ti exceeding 8GB. But I guess it's gonna take some people a few days to catch up...
Then there's future games, and the answer is: who knows? Certainly, I've never said 8GB is enough for games that'll come in the future, because I don't know. I can speculate; the consoles don't have 8GB of VRAM, let alone 10. DirectStorage will be a thing, which means future games should start leaning more on texture streaming and less on trying to keep as much in VRAM as possible. Could VRAM requirements actually decrease? It's possible. I don't know for sure though. What I do know is that I have never said 8 or 10GB will definitely be "enough". "Enough" is such a vague word that I wouldn't bother anyway. But you watch people continue to accuse me of saying otherwise.
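On the allocation charts point above: if anyone wants to sanity-check that number themselves, here's a rough sketch (assuming an Nvidia card with nvidia-smi on the PATH) that polls the figure most overlays report while a game runs. Bear in mind this is allocated VRAM, not what the game actually touches each frame:

```python
# Minimal sketch: poll the VRAM figure overlays/charts report.
# Assumption: an NVIDIA GPU with nvidia-smi available on the PATH.
# Note this is *allocated* memory, not the game's real working set.
import subprocess
import time

def allocated_vram_mib() -> int:
    """Return total allocated VRAM in MiB as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    # One line per GPU; sum them in case of a multi-GPU system.
    return sum(int(line) for line in out.splitlines() if line.strip())

if __name__ == "__main__":
    # Sample once a second while the game is running.
    while True:
        print(f"allocated VRAM: {allocated_vram_mib()} MiB")
        time.sleep(1)
```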
Bandwidth. 448GB/s vs 760GB/s. There's a massive difference in bandwidth between the 3070 and 3080. That's the reason the performance drops off even more against the 3080 than it does against the 2080 Ti. It's not the size of the buffer, it's the width of the bus.
So far we have established that the 3070 can be up to 38% slower than the 3080. We've also discovered that it's only 27% slower when not hampered by VRAM at 4K.
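Just to put rough numbers on that, a quick back-of-the-envelope using only the figures quoted above (the 448/760 bandwidth specs and the 38%/27% deltas already mentioned):

```python
# Rough arithmetic on the figures quoted above: 448 GB/s (3070) vs 760 GB/s (3080),
# and the 38% / 27% performance deltas mentioned earlier in the thread.
bw_3070, bw_3080 = 448, 760  # memory bandwidth in GB/s

print(f"3080 bandwidth advantage: {bw_3080 / bw_3070 - 1:.0%}")  # ~70% more than the 3070
print(f"3070 bandwidth deficit:   {1 - bw_3070 / bw_3080:.0%}")  # ~41% less than the 3080
# Compare with the quoted performance gaps: up to 38% slower at 4K,
# and ~27% slower when not hampered by VRAM.
```

On those numbers the raw bandwidth gap is larger than either quoted performance gap, which is the point the bandwidth argument is making.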
Three games have been shown to USE (remember I said use, not cache or pool or anything else) more than that, and pretty much the entire tech press have voiced their concern.
And that still isn't enough for people? We even had someone here think that what Nvidia said was gospel; you couldn't make it up.
You can't tell him, he doesn't care.
He will just bang on and on and on about how it's perfectly fine.
This is exactly what I predicted yesterday. Nvidia wanted to focus on 1440p because they knew it matches a 2080 Ti there, which is what they had said in their marketing. But this guy knows more than Nvidia. Guys like that? Don't waste your time on them.
Latest rumours point towards a 3070 Ti with 10GB of G6X and a 3080 Ti with 12GB of G6X and an increased CUDA core count.
I suspect it's a bit of everything coming into play TBF. It will be interesting to see if Nvidia launches a 16GB variant of the GA104 or one with faster memory modules - it would definitely give us some more insights into where the bottlenecks are!