Is 8GB of VRAM enough for the 3070?

Yes... it's a 4GB card. Fact. The bits they mis-advertised had nothing to do with the VRAM. Fact. I guess it's easier for you to believe I'm defending Nvidia than to listen to reason. Nvidia got hauled over the coals for the false advertising, and rightly so. But yeah, I'm defending them lol.
 

You are defending them; that's obvious to anyone reading the thread. NV have a history of VRAM rip-offs, end of story.

:)
 
I don't know if 8GB or 10GB is enough, but I can tell you that in 2020 my 4GB GeForce GTX 980 isn't enough to game at 1440p. Most games I try these days seem to complain about it!
 
You have no idea what you are talking about. Let's not pretend anybody else with an ounce of common sense agrees with you.

I've stated the facts. While you've got butt-hurt about NV getting caught out lying about their product (they lied about their card to the extent that they had to settle a lawsuit), you jump to their defence. It seems to me I know exactly what I'm talking about and have stated the facts to prove it, while you waffle and spin for NV.

I think most people with any common sense understand why a company has to settle a lawsuit. The facts speak for themselves...

:D
 
Because they mis-advertised the number of ROPs and the size of the L2 cache. There's no getting away from that. Why also do you think no lawsuit existed outside the US? Gee, why might that be, I wonder... There's a reason no lawsuit existed at all against the 660 Ti, which had a similar memory arrangement: with that card Nvidia didn't mis-advertise any technical details. The 970 cockup was about details that had nothing to do with the VRAM.

This is boring. You made a point about the card not having 4GB of VRAM. It categorically does. You're wrong. That's all there is to it. Stop trying to pretend I'm defending Nvidia for something, as I've said nothing of the sort.
 

But the VRAM is not the same as the cost of the video card; it's just one component. There are many factors that go into cost: the speed of the memory, the bus width it operates across, how fast the GPU is, the quality of the components, the quality of the cooler, etc.
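As a rough sketch of how just two of those factors, memory speed and bus width, combine into bandwidth (the figures below are the commonly published 970 specs, used purely as an example):

```python
# Peak theoretical memory bandwidth = effective memory clock x bus width.
# The GTX 970 figures here are the commonly published specs, as an example only.

def peak_bandwidth_gbs(effective_clock_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: Gbps per pin, times the number of pins, over 8."""
    return effective_clock_gbps * bus_width_bits / 8

# GTX 970: 7 Gbps effective GDDR5 on a 256-bit bus.
print(peak_bandwidth_gbs(7.0, 256))  # 224.0 GB/s - the headline number
```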

This is so dumb I'm just going to stop responding now. Too many insane comments about how VRAM = price and VRAM = performance. If you don't get why this is wrong then you're just too far gone to have a sensible conversation with about this.
 
Hard to believe there's anyone defending NV over the 970 debacle, let alone in 2020. Truly a first for me to witness, but at least it makes it clear who is and isn't discussing these issues in good faith.
 
I wouldn't be comfortable with 8GB of VRAM at 1440p and above, especially going forward with new games, new consoles, and AMD cards having a hefty amount more.

We've seen in the past how badly cards age when they have just about enough VRAM for today but not enough for tomorrow.
This. It will be fine for 1080p, borderline OK for 1440p and not enough for 4K. It's always better to have VRAM and not need it than the other way around, and newer open-world games are going to need plenty. If 8GB was the right amount four years ago, it isn't enough now, and certainly not going forward for anything other than 1080p.
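For a rough sense of why resolution drives VRAM use, here's a back-of-envelope sketch. The render-target count and format are invented purely for illustration (real engines vary widely), and textures, geometry and driver overhead come on top:

```python
# Back-of-envelope: render-target memory scales with pixel count, which is
# one reason higher resolutions eat more VRAM. The target count and pixel
# format below are made up for illustration; real engines differ.

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

def render_target_mb(width: int, height: int,
                     targets: int = 6, bytes_per_pixel: int = 8) -> float:
    """Rough size of a deferred G-buffer: N targets at RGBA16F (8 B/px)."""
    return width * height * targets * bytes_per_pixel / 1024**2

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")
    # 4K works out to roughly 4x the 1080p figure.
```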
 
Eurgh. The memory split on the 970 happened because of the way the cores were fused off. The only way around that would be to use cores without defects, and guess what a fully functional core would have been called? GM204-400... aka the GTX 980. If Nvidia had charged 980 prices for a 970 you'd have a point, but they didn't, did they? The 970 was far from a rip-off; it was an exceptionally priced card. But let's not let facts get in the way of a good moan.
Wut? They called it a 4GB card, ffs, and lost a lawsuit and had to compensate tens of thousands (at least) of customers because of their dishonesty, but that's somehow OK because if they'd been honest it would have been more expensive :confused:. Some people on this forum would argue white is black and up is down if Nvidia said it was so.
 
You have no idea what you are talking about. Let's not pretend anybody else with an ounce of common sense agrees with you.
I agree with him, but I'm not defending deliberately deceitful behaviour from a US corporation out to con its consumers. Defending that, as you are, is simply indefensible and totally destroys any credibility you or others who argue similarly may have. End of story.
 
https://www.polygon.com/2016/7/28/12315238/nvidia-gtx-970-lawsuit-settlement


Their complaints centered around three allegations:

The GTX 970 was marketed as a card with 4 GB of video memory, whereas the card actually has 3.5 GB of RAM, with the remaining 0.5 GB being a much slower "spillover segment" that is decoupled from the main RAM

The GTX 970 was marketed as a card with 64 render output processors, whereas the card actually has 56 render output processors

The GTX 970 was marketed as a card with an L2 cache capacity of 2,048 KB, whereas the card actually has a 1,792 KB L2 cache

The parties came up with the $30 figure by focusing on the main misrepresentation alleged by the plaintiffs: that they were cheated out of 0.5 GB out of a total of 4 GB of RAM (one-eighth, or 12.5 percent). Since the average retail price of a GTX 970 was approximately $350, the members of the settlement class could have expected to receive 12.5 percent of $350, or $43.75, as compensation — if the case had gone to trial and the plaintiffs had won. Thus, the proposed settlement argues, a $30 payout for each GTX 970 customer — nearly 70 percent of the $43.75 that they might have gotten — "constitutes an excellent recovery."
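The quoted arithmetic checks out; here's a quick sketch reproducing it:

```python
# Reproducing the settlement arithmetic from the quoted article.
avg_price = 350.00          # approximate average retail price of a GTX 970
missing_fraction = 0.5 / 4  # 0.5 GB short of 4 GB = 12.5%

full_recovery = avg_price * missing_fraction
print(f"Expected trial recovery: ${full_recovery:.2f}")     # $43.75
print(f"$30 payout as a share:   {30 / full_recovery:.0%}")  # ~69%, 'nearly 70 percent'
```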
 
Wut? They called it a 4GB card, ffs, and lost a lawsuit and had to compensate tens of thousands (at least) of customers because of their dishonesty

The lawsuit wasn't just about the VRAM. When are people going to listen?

but that's somehow OK because if they'd been honest it would have been more expensive :confused:

I didn't say that at all. Nowhere did I say that being honest meant it would have been more expensive. Do me a favour: if you're going to try and pick my posts apart, at least read them properly.

GM204-400 (GTX 980) was a fully enabled die. GM204-200 (970) was a cut-down die. Nvidia hit the price they could because they used faulty 980 dies and fused off a part of the core. I forget the exact technical detail now; I can't be bothered to go look. You can do that. Anyway, had Nvidia been forced to use fully enabled dies and reject the faulty ones entirely, what do you think would happen? Fewer available dies = higher demand = more cost. Fewer working dies per wafer = higher cost per die = MORE COST. Do you think that cost would have been absorbed, or passed on to the customer?
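To put that yield argument in concrete terms, here's a toy model; every number in it is invented, since real wafer costs and yields aren't public:

```python
# Toy model of 'fewer working dies per wafer = higher cost per die'.
# All numbers are invented for illustration; real wafer economics are secret.

WAFER_COST = 5000.0   # hypothetical cost per processed wafer, USD
DIES_PER_WAFER = 200  # hypothetical GM204-sized dies per wafer

def cost_per_sellable_die(yield_fraction: float) -> float:
    """Wafer cost amortised over only the dies that can be sold."""
    return WAFER_COST / (DIES_PER_WAFER * yield_fraction)

# If only fully working (980-grade) dies were sellable:
print(cost_per_sellable_die(0.60))  # ~$41.67 per die
# If partially defective dies can be fused off and sold as 970s too:
print(cost_per_sellable_die(0.85))  # ~$29.41 per die
```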

This is why the VRAM situation existed: because Nvidia wanted a 4GB card. Could they have released a 970 with 3.5GB of VRAM instead? Sure. But that would have had less usable memory. So what did Nvidia do? They decided to put 4GB of RAM on the card, of course. It wasn't perfect, it caused issues with SLI especially, and Nvidia absolutely could have been more honest about that.
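For a sense of what that split costs in practice, here's a toy model using the widely reported segment bandwidths (196 GB/s for the 3.5GB partition, 28 GB/s for the 0.5GB one) and assuming, pessimistically, uniform access across the whole allocation; real drivers try to keep hot data in the fast segment:

```python
# Toy model of the 970's segmented memory: time-weighted average bandwidth
# as an allocation spills past the fast 3.5 GB partition. Assumes uniform
# access across the allocation - a worst case, since drivers keep hot data
# in the fast segment. Segment bandwidths are the widely reported figures.

FAST_GB, FAST_BW = 3.5, 196.0  # GB, GB/s
SLOW_GB, SLOW_BW = 0.5, 28.0

def effective_bandwidth(alloc_gb: float) -> float:
    if alloc_gb <= FAST_GB:
        return FAST_BW
    slow_used = min(alloc_gb - FAST_GB, SLOW_GB)
    # Transferring each segment once takes size/bandwidth seconds.
    total_time = FAST_GB / FAST_BW + slow_used / SLOW_BW
    return (FAST_GB + slow_used) / total_time

for gb in (3.0, 3.5, 3.8, 4.0):
    print(f"{gb:.1f} GB allocated -> ~{effective_bandwidth(gb):.0f} GB/s")
    # Drops from 196 GB/s at 3.5 GB to ~112 GB/s with all 4 GB in use.
```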

Am I defending Nvidia's lack of transparency? Sorry, where did I say it was OK for Nvidia to lie? I didn't. Same old story: know-nothings confuse an explanation of why the 970 was designed the way it was with a defence of the way Nvidia handled the fallout, and ignore me saying over and over again that Nvidia got what they deserved. NVIDIA GOT WHAT THEY DESERVED. <-- that's not defending them.

Some people on this forum would argue white is black and up is down if Nvidia said it was so.

Indeed. Like claiming the 970 doesn't have 4GB of RAM. I tell you what, go read the lawsuit and then work out what 3.5 + 0.5 equals. Use a calculator if you have to.

I agree with him, but I'm not defending deliberately deceitful behaviour from a US corporation out to con its consumers. Defending that, as you are, is simply indefensible and totally destroys any credibility you or others who argue similarly may have. End of story.

Whereas you don't understand the architecture at all, so you are clearly more 'in the know', right? Move on.
 
RTX 3070 benchmarks from the reviewer's guide

It's losing to the 2080 Ti in Control with RTX on... and losing to the 2080 Ti in BL3... It's basically winning in some synthetics and a couple of games like Minecraft RTX.

Underwhelming. The only redeeming feature would be the price: at £500 you're getting a slightly worse 2080 Ti, which probably won't OC very well and will likely lose more than it wins vs an OC'd 2080 Ti.

The gouging on the 3070 will probably make it a bad buy. At £500 it's a so-so buy; above that it's looking bleak.
 
So it looks like the 3070's 8GB of VRAM is not optimal for 1440p/4K.

Is the 3080's 10GB enough for 1440p then? I guess I will stretch my budget if needed. I would like it to last 3-4 years at high settings at least, 1440p @ 60+ fps.
 
Some games will be pushing it now to keep over 60fps at high settings, let alone in 3 or 4 years' time :D
and especially now that we have the next-gen consoles coming out, games over the next few years may push GPUs much harder
 