10GB vram enough for the 3080? Discuss..

At first glance I thought I agreed with you, as it's sharper, but the high picture has the same issue I posted a couple of days back, where the reflection is overpowering and looks like it's extending below the floor. Are both of these RT enabled? Is there a screenshot of that scene without RT?

This game just seems badly made to me.

Not a screenshot, nor the exact same angle:

[image: nLXh4eq.jpg]

SSR can only do so much; in some parts it looks good, but in others it is awful, e.g. SSR artefacts and reflections disappearing with even the slightest angle adjustment.


[image: Vv0x6Poh.jpg]

RT reflections are broken and glitchy and break the immersion even more than SSR in this game. Hopefully RT improves, though, as the game is in desperate need of "good" RT.
 
What have you got your in-game HDR settings set to, if you don't mind?
 
Peak/white brightness set to 932; I can't set exactly 1000 to match my HDR1000 monitor, but it should match whatever your display's peak brightness is.
Black point = 0
HDR colour = 25
UI = lowest value

It looks more saturated/vibrant in real life than in the photos.
 
The bus width was different because, as I've said, you can't just add 2GB of RAM. The 3080 10GB couldn't have had 12GB; it would need to have had 20GB, or a wider bus, which further drives up the cost.

How do you know that the bus could not be increased using the 3080 10GB core? Obviously it could, because the 3080 12GB exists, for which they charged an extra £100. And consider how utterly greedy they are. Put it this way: if you spent the extra £100 you did the right thing.

They only released that card (the 12GB) because of what was happening to the 10GB at launch at 4K, exactly the same time this thread began.

As bitter a pill as it may be to swallow, this is completely deliberate and how they stop cards lasting people too long. They can't bake in a self-destruct module, as they would probably get caught. They can't stop making drivers for it too soon, because again people would notice. So they prey on people's behaviour, and this is the safest and cheapest way to make sure they keep coming back for more.

The 20 series was pretty much shunned by reviewers. The prices were insane, and anyone with a 1080 or 1080 Ti had all they needed. So they made sure with the 30 series that people would come back for more once they released the 40 series. And oh look, it's worked. Who'da thunk it, eh?

Also, dude, unless you are Jen's accountant you really have no idea how much it costs to increase the bus. I have a pretty good idea of how much VRAM costs, because you can find out if you look, and believe me when I say it is nowhere near as precious or expensive as Nvidia want you to believe. Like I said, with things being a bit bland and at a standstill in a gaming sense, Nvidia need a way to keep you buying. 1070: 8GB. 1080: 8GB. 2070: 8GB. 2080: 8GB. 3070: 8GB. So for three generations they did not change it, even though they knew for a fact RT would guzzle VRAM.
 
They only released that card (the 12GB) because of what was happening to the 10GB at launch at 4K, exactly the same time this thread began.
Pretty sure every reviewer was raving about the 3080 at launch, as it crushed the 2080 Ti for almost half the money while being around 70-80% faster than the 2080. It's probably the best high-end card Nvidia have released in the last 5 years in terms of price to performance.

The 12GB model was released so Nvidia could charge AIBs a lot more cash and take a bigger share of the mining windfall.

As I said earlier, how many would have preferred paying for a 3080 built on a GA104 die with a 256-bit bus and 16GB of VRAM over a GA102 with a 320-bit bus and 10GB? Well, that's the route Nvidia have now gone down with the 4080, and look how **** that is for the money.
 
The 3080 12GB didn't even come with an MSRP from Nvidia.

The 3080 12GB was being sold from the start for £999 or $999 and up, and the 3080 Ti was $1,200 MSRP for the FE.
 
I never said you can't increase the bus width as well. My point is that shipping the card with 12GB of RAM instead of 10GB isn't just the cost of a single 2GB RAM module; it's more complicated and costly than that.

The 10GB card is absolutely great, and it's going to be relevant for as long as its 16GB competitor is.

You are saying RT needs VRAM, but RT also needs... RT performance. AMD shipped the first, Nvidia the latter. Both cards will go to the grave hand in hand, but until then the 3080 will offer the better experience in 99% of games.
 
The 3080 12GB didn't even come with an MSRP from Nvidia.

The 3080 12GB was being sold from the start for £999 or $999 and up, and the 3080 Ti was $1,200 MSRP for the FE.

Ah, I see. So for the extra 2GB they wanted even more. Makes sense, lol.

I'm just going on what my mate paid. He got a Strix for £720, and at the time the 10GB was going for about £620. But yeah, thinking back, it didn't sell well because it was priced too close to the Ti.
 
How do you know that the bus could not be increased using the 3080 10GB core? Obviously it could, because the 3080 12GB exists.
Two choices: split the memory bus (and look how well that went down with the 970), or more memory = a bigger bus, and a bigger bus = more memory controllers on the die. All of this = more cost. The 3080 was aggressively priced; it wouldn't have been if it had been released as a 12GB card.
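To put rough numbers on that bus/controller point, here's an illustrative sketch (my own example, not from the thread; it assumes the usual GDDR6X layout of one memory chip per 32-bit controller, 1GB or 2GB chip densities, and the 3080's published 19Gbps memory speed):

```python
# Rough GDDR6X maths: each memory chip hangs off its own 32-bit channel,
# so capacity and bus width are tied together.
def config(chips, gb_per_chip, gbps_per_pin):
    bus_bits = chips * 32                        # one 32-bit channel per chip
    capacity_gb = chips * gb_per_chip
    bandwidth_gbs = bus_bits * gbps_per_pin / 8  # bits per second -> bytes per second
    return capacity_gb, bus_bits, bandwidth_gbs

# 3080 10GB: ten 1GB chips on a 320-bit bus at 19 Gbps
print(config(10, 1, 19))   # (10, 320, 760.0)
# 3080 12GB: twelve chips means all twelve controllers are active (384-bit)
print(config(12, 1, 19))   # (12, 384, 912.0)
# Keeping the 320-bit bus but doubling chip density gives 20GB, not 12GB
print(config(10, 2, 19))   # (20, 320, 760.0)
```

That is why "just add 2GB" isn't an option on the 10GB board: you either light up more memory controllers (wider bus, bigger/more expensive die harvest) or double the chip density and jump straight to 20GB.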
 
Back in the early days, Intel started selling GPUs that used your system RAM as VRAM.

Imagine if we had that now, and people could give their GPUs as much VRAM as they want; we wouldn't have to moan all the time.
 
Like I said, with things being a bit bland and at a standstill in a gaming sense, Nvidia need a way to keep you buying. 1070: 8GB. 1080: 8GB. 2070: 8GB. 2080: 8GB. 3070: 8GB. So for three generations they did not change it, even though they knew for a fact RT would guzzle VRAM.
While the 3070 could have done better with 10GB, the 2080 and older are fine 99.99% of the time with 8GB. Even the almighty consoles with their 16GB (actually less than that, since the OS needs its own RAM) will fall flat due to lack of power, not VRAM.
 
Peak/white brightness set to 932; I can't set exactly 1000 to match my HDR1000 monitor, but it should match whatever your display's peak brightness is.

You can change it to 1000 in an .ini file in AppData somewhere.
 
Back in the early days, Intel started selling GPUs that used your system RAM as VRAM.

Imagine if we had that now, and people could give their GPUs as much VRAM as they want; we wouldn't have to moan all the time.

They do that now. Unfortunately DDR is nowhere near as fast as GDDR, and it utterly tanks your performance.
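To give a sense of that gap, here's a rough comparison (my own illustration; the per-pin rates are the standard published figures for dual-channel DDR4-3200 and the 3080's GDDR6X, used purely as examples):

```python
# Peak theoretical bandwidth = bus width (bits) * transfer rate (GT/s) / 8
def peak_bandwidth_gbs(bus_bits, gtps):
    return bus_bits * gtps / 8

# Dual-channel DDR4-3200 system RAM: 128-bit bus at 3.2 GT/s
print(peak_bandwidth_gbs(128, 3.2))   # ~51 GB/s
# 3080's GDDR6X: 320-bit bus at 19 GT/s
print(peak_bandwidth_gbs(320, 19))    # 760 GB/s
```

Roughly a 15x difference in peak bandwidth, which is why spilling into system RAM hurts so badly.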

It used to be worse than that, too: it used to use the paging file on your hard drive. It was atrocious on an SSD, and a million times worse on a spinning drive. It would literally freeze your system.

Nvidia used to call it, and I quote, "Texture Streaming", and it became an issue very quickly. My first experience of it was with my GTX 470 in BF3. The mall sniper level where you had to protect the hostage was practically impossible, and for the life of me I could not figure out what was going on. Back then my reactions and eyesight were far better than they are now, yet I kept getting rushed because there was input lag when I shot. Changed to a 6970? Breezed through it in seconds.

There have been numerous cases of GPUs not having enough VRAM at launch. Case in point: my Fury X. I can't remember exactly which COD game it was, as it was many years ago, but the game kept crashing, literally running at about 1 FPS before the PC would completely freeze. They patched it, but I immediately noticed that settings which were there before were now completely missing. I didn't understand it, so I Googled it. You could actually hack the settings back into place, but the card was running out of VRAM, and at that point, lol, AMD had not even bothered to implement any sort of texture streaming at all. They "fixed it", if you could call it that, about a year later, by copying Nvidia. Unfortunately what you ended up with was an 8 FPS mess.

But do you remember what everyone said when the Fury X came out? "Oh no no, 4GB is more than enough because it's HBM!" and all sorts of other pony that AMD wanted people to say. Fact is, sure as eggs, if you run out of VRAM on a 3070 you're screwed, or on any other GPU for that matter.

Here, some proof.

[image: TA9Xsm4.png]
 