
NVIDIA ‘Ampere’ 8nm Graphics Cards

What that says to me is that if Nvidia use the words "10GB was deemed enough for cost-efficiency reasons" then (as many of us suspected) they did knowingly compromise on the VRAM amount. That sentence with the word "deemed" tells me all I need to know, as it is a word that is very often used when someone is uncertain or hesitant and wants to justify something in a more roundabout and less definite way while still staying formal and giving a direct answer. The people earlier in the thread who were saying that "VRAM will not be a problem and no-one knows better than Nvidia" are wrong. Nvidia know they compromised, and we will undoubtedly see higher VRAM amounts as GDDR6X gets cheaper.

We'll see higher vram cards, but that's because people will buy them due to a lack of understanding, not because it's required.

Stop conflating game-indicated consumption with actual required levels, unless you have evidence of the impact, such as like-for-like benchmarks that demonstrate VRAM requirements and bottlenecking with next-gen games.

By the time 10GB is a problem, RTX IO will negate the need for high levels of vram anyway.
 
So you think we will see a 3080 Ti that was not launched with the 3080 and 3090, and that has not thus far been part of any leaks that didn't link it to the 3090... before Christmas? Really? :confused:

if nvidia delivers then yes. It has happened before.
 
Well, if the DF video on BL3 (which does not use RTX) is right, then there has been a large improvement, but I think it might only be in DX12 or Vulkan games; maybe they did some magic there. I bet any old DX11 game won't see as much of a jump

I also bet AMD's next card is around 2080 Ti levels

So why does Nvidia use a chart which shows only a 1% average gain for the 3070 over the 2080 Ti? Clearly, in some games, I am expecting a 20% gain over it.
 
We'll see higher vram cards, but that's because people will buy them due to a lack of understanding, not because it's required.

Stop conflating game-indicated consumption with actual required levels, unless you have any like-for-like benchmarks that demonstrate VRAM requirements and bottlenecking with next-gen games.

By the time 10GB is a problem, RTX IO will negate the need for high levels of vram anyway.
You are of course entitled to your opinion and whatever you want to believe, but I think many will disagree, especially as we have already seen games using up to or even more than 10GB of VRAM. Your statement about RTX IO is also pure unsubstantiated speculation.

if nvidia delivers then yes. It has happened before.
If Nvidia deliver what? Do you mean if AMD delivers?
 
So why does Nvidia use a chart which shows only a 1% average gain for the 3070 over the 2080 Ti? Clearly, in some games, I am expecting a 20% gain over it.

If using RTX then yes, most likely, but I think you're being greedy :) The 3070 delivering 2080 Ti performance and much cheaper, and you want more ;)
 
You are of course entitled to your opinion and whatever you want to believe, but I think many will disagree, especially as we have already seen games using up to or even more than 10GB of VRAM. Your statement about RTX IO is also pure unsubstantiated speculation.

But to reiterate, a game indicating that it's using more than 10GB of VRAM does not necessarily mean that the frame rates would have been any lower if it only had access to 10GB, or less. A game may fill VRAM if it's available, but this does not mean it needed to fill it.

I've not seen any benchmarks comparing the 3080 to the 3090 yet, so if the difference is marginal at 4K ultra that might give us a good idea of how much it matters.

Regarding RTX IO, since it was called out specifically as a feature in the reveal presentation, it's not a huge leap to assume that it's useful for something, nor is it a huge leap to assume that it's Nvidia's answer to the consoles' SSD direct-streaming tech, but yes, I suppose it's conjecture at this stage to say that it will offset VRAM requirements.
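
As a rough illustration of the allocated-vs-needed distinction, here is a minimal sketch (assuming Python with the pynvml NVML bindings installed) of reading what a card reports as "used". That figure is simply whatever has been allocated on the device at that moment, which is exactly why a game filling spare VRAM with caches can look like it "needs" more than it really does:

```python
# Minimal sketch: read reported GPU memory usage via NVML (pynvml bindings).
# The "used" figure is the total currently allocated on the device -- a game may
# have filled spare VRAM with optional caches, so this is not a per-frame requirement.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # total / free / used, in bytes
print(f"Total: {mem.total / 2**30:.1f} GiB")
print(f"Used (allocated, not necessarily needed): {mem.used / 2**30:.1f} GiB")
print(f"Free:  {mem.free / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```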
 
How can you be "reading conflicting reports" when Nvidia clearly showed and described what ports they have in the reveal video? Just watch it and read the official baseline specs on the sites.

The reason I've read conflicting reports is quite simply because two sites I frequent gave me conflicting reports.

For instance:

https://www.windowscentral.com/nvid...tx-3080-rtx-3070-pricing-details-availability

Windows Central said:
You're getting an insane 24GB of GDDR6X video memory over a 384-bit wide memory interface, 5248 CUDA cores, HDMI 2.1 and three DisplayPort 1.4 ports.

Conflicts with:

https://www.digitaltrends.com/computing/nvidia-rtx-3090-vs-rtx-3080/

Digitaltrends said:
Although 4K is the focus right now, both cards support 8K with three HDMI 2.1 ports and a lone DisplayPort 1.4a connection

I've never watched a reveal video; I don't keep up with that sort of thing in the way that perhaps some of you do, so I wouldn't have known that they show anything beyond a few marketing-focused performance graphs. But regardless, I didn't think that asking here was at all unreasonable, especially given that we don't have the cards in reviewers' hands yet.

But anyway...

Every single card I have seen so far has gone with 3 x DisplayPort 1.4a and 1 x HDMI 2.1. Other variations may come later, but that seems to be the norm

Thanks Greebo, it looks to me as though the article which gave me cause for concern probably contained a typo.
 
Even if AMD did match the 3080/3090, I think I'll stick with Nvidia. Hopefully, if they do match performance, the price should be that wee bit cheaper.
 
So apparently CyberPunk will use RTX IO.
That is good to know, probably not on release though.


:D glad to see your sense of humour is back!
He was being serious. Lol.


If you can afford a 3080, fair enough; it's a shame the 3070 doesn't have more VRAM, otherwise I think more would have settled on that.

Same here, but let’s wait and see reviews.


Half the forum was saying this before Ampere launched, while the rest of us were saying that Turing was an anomaly and that normal generational progress would have put the 2080 Ti's performance at the $700 price point. Now here we are. The $700 price point is now faster than it has ever been by a sizable margin... I would argue two generations of progress have occurred at the $500 and $700 price points since Pascal.

If the benchmarks hold up, this is what generational progress looks like.
Yep. I actually think it was only a handful of us that were saying the price was not going to be as bad. I said on many occasions the 3070 would likely not be more than £500. My final prediction before launch was £450-£500. Some pretend I did not say this though; not hard to use the search function ;)

Ah what the hell, we all know ALXAndy does not know how to use search so I will do it for him:

The amount of people that say prices will go up is crazy. Music to Jensen’s ears as it suggests they don’t mind.

Prices will not be going up in my opinion. 3080Ti will be cheaper and anything below that will probably remain similar. I can see the 3070 landing at £450-£500 and offering 2080 Ti performance but with better RT. That is the card I will get.

That was all the way back in January by the way.


I expect 8/10gb of vram to be enough for the next couple of years, but after that I can see it struggling. So I guess it depends how long you keep your cards. I tend to keep them 3 or 4 years, so I am more than a little worried.

They've been deliberately thrifty with the vram so they can sell more cards in two years time if you ask me.
Yeah, same. It is a little close for comfort, but at least, for those who want to, there will be the choice to pay for the extra VRAM.


You are of course entitled to your opinion and whatever you want to believe, but I think many will disagree, especially as we have already seen games using up to or even more than 10GB of VRAM. Your statement about RTX IO is also pure unsubstantiated speculation.
But dude, it is well known that what is shown as used is not necessarily what is needed. I remember when Kaapstad had the first 12GB GPU and I think it was a COD game that gobbled it all up; that did not stop 4GB cards working just fine with the game. I had a 12GB Titan too, and many games used 10-12GB, but those games ran just fine at 8GB on other people's cards.

Now I am not happy with the 3070 and 3080 having 8 and 10GB. But I understand why Nvidia did it. I also think that in 99.9% of games it will be fine, and in the small number of cases where it is not, you will just have to lower the texture detail one notch, job done.

If I go for a 3080 I will likely keep it until a few months before the release of the 4000 series, which will likely be in 24 months' time, so I can't see 10GB being a major issue in that time frame; worst case, I drop down one notch from max for a handful of games for a short while until the next gen arrives.

I am not trying to defend having less VRAM, I wanted to see a minimum of 16GB. But at least we have a choice now. I will wait for reviews, but if my choice is to pay £550 for a 16GB 3070 or £650 for the faster 10GB 3080, I will likely opt for the latter. If they can hit £500 on a 16GB 3070 then I will get that.
 
I'm hoping that the 3090 can sit on top of 2 riser cables.
My mATX mobo already has a 10Gb LAN card and a SATA card.

The only way to fit the 3090 is to put the LAN/SATA cards on risers :(
 
The 3080 doesn't have memory on the back of the PCB; only the 3090 does
Great, an AIO SHOULD be OK then. My current 1080 Ti has the memory chips near the GPU processor, and the only air they get is the blow-off from the side fan cooling the VRM.
It's always been that layout and it seems to work well. Guess I must keep my hands off an FE card :)
 
If VRAM progress slows down, game developers will just slow down their progress: lower quality textures etc. than they would use if we all had 16GB. Developers have always targeted realistic PC specs. It would be easy for a developer to create a game that requires unrealistic hardware; they don't, because nobody could run it.

If 4GB of VRAM suddenly became standard again due to shortages, I guarantee 100% that developers would make sure that all future games they make don't need more than 4GB.

Everything always adjusts.
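
As a toy example of that kind of targeting (the tier names and numbers below are entirely made up, not taken from any real engine), a game can simply pick its texture preset from whatever VRAM budget it detects, which is why the "standard" amount ends up defining what games ask for:

```python
# Hypothetical sketch: choose a texture quality preset from the detected VRAM budget.
# The tier names and sizes are invented for illustration, not taken from any real engine.
PRESETS = [
    (4,  "low"),     # cards with at least 4 GiB get low-res textures
    (8,  "high"),    # 8 GiB treated as the mainstream target
    (10, "ultra"),   # 10 GiB and above unlocks the top tier
]

def pick_texture_preset(vram_gib: float) -> str:
    chosen = PRESETS[0][1]            # fall back to the lowest tier
    for budget, name in PRESETS:
        if vram_gib >= budget:
            chosen = name             # keep upgrading while the budget allows
    return chosen

print(pick_texture_preset(10))   # "ultra"
print(pick_texture_preset(4))    # "low"
```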
 
I am confused about whether to get an AIB card or the FE for the RTX 3080
I am strongly considering going for an FE; the cooler looks like a thing of beauty and seems very well designed and engineered. The only drawback is that you then limit the VRAM to 10GB, versus third parties that may double it.
 
But to reiterate, a game indicating that it's using more than 10GB of VRAM does not necessarily mean that the frame rates would have been any lower if it only had access to 10GB, or less. A game may fill VRAM if it's available, but this does not mean it needed to fill it.

I've not seen any benchmarks comparing the 3080 to the 3090 yet, so if the difference is marginal at 4K ultra that might give us a good idea of how much it matters.

Regarding RTX IO, since it was called out specifically as a feature in the reveal presentation, it's not a huge leap to assume that it's useful for something, nor is it a huge leap to assume that it's Nvidia's answer to the consoles' SSD direct-streaming tech, but yes, I suppose it's conjecture at this stage to say that it will offset VRAM requirements.

If we are NOW at or over the limit of 10GB in complex games, then I fail to see how, logically speaking, 10GB of VRAM will be enough for the next 2-3 years, assuming that games' graphical fidelity and engines are only going to get more complex and demanding. PC GPU history has shown that VRAM requirements in games only increase (however gradually) over time.
 
NVIDIA are going to release Ti / Super variants, it's just a case of when they decide to.
For me it's a question of whether my 1080 Ti can last a little longer until those are released, as those are the cards to get with more VRAM. :-/

Either that or you say 'screw it, I'm getting a 3090' lol.
 
You are of course entitled to your opinion and whatever you want to believe, but I think many will disagree, especially as we have already seen games using up to or even more than 10GB of VRAM. Your statement about RTX IO is also pure unsubstantiated speculation.


If Nvidia deliver what? Do you mean if AMD delivers?

Oops, half asleep. I did mean AMD
 
If we are NOW at or over the limit of 10GB in complex games, then I fail to see how, logically speaking, 10GB of VRAM will be enough for the next 2-3 years, assuming that games' graphical fidelity and engines are only going to get more complex and demanding. PC GPU history has shown that VRAM requirements in games only increase (however gradually) over time.

That depends on how RTX IO works out too, I would think. But I still feel like you are missing what is being said: games may only need, say, 8GB, but where a card has 12-16GB they utilise it to cache stuff because they can. That shouldn't be an issue for the most part, and with that, as I briefly said, RTX IO should mean that games which utilise it can just swap out elements on the fly fast enough to have no issue, just like the consoles will be doing.
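
One rough way to picture that "cache it because the VRAM is there" behaviour is the hypothetical sketch below: an asset cache that keeps as much resident as its budget allows and simply re-streams evicted assets from the SSD on a miss. The names and numbers are invented for illustration, and this is not how RTX IO or any particular engine actually works; it just shows why a smaller budget means more streaming rather than an automatic loss of frame rate:

```python
# Hypothetical sketch of an opportunistic VRAM cache: spare memory keeps assets resident,
# and anything evicted is re-streamed from disk on demand. Invented names/numbers; this
# is not the real RTX IO API.
from collections import OrderedDict

class AssetCache:
    def __init__(self, budget_mib: int):
        self.budget = budget_mib
        self.used = 0
        self.resident = OrderedDict()        # asset name -> size in MiB, kept in LRU order

    def request(self, name: str, size_mib: int) -> str:
        if name in self.resident:            # already cached: no streaming needed
            self.resident.move_to_end(name)
            return "hit"
        while self.used + size_mib > self.budget and self.resident:
            _, freed = self.resident.popitem(last=False)   # evict least recently used
            self.used -= freed
        self.resident[name] = size_mib       # "stream" the asset in from the SSD
        self.used += size_mib
        return "streamed"

# A 10 GiB card just evicts and re-streams a little more often than a 16 GiB one;
# whether that costs frame rate depends on how fast the streaming path is.
cache = AssetCache(budget_mib=10 * 1024)
print(cache.request("city_block_07", 3000))   # streamed
print(cache.request("city_block_07", 3000))   # hit
```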
 