Is 8GB of VRAM enough for the 3070?

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,178
Location
Greater London
The 3070 will go down as short-changed by NVIDIA again on VRAM.
Yeah, probably. But I doubt it will be as bad as, say, Fury was. I am happy AMD went with 16GB. It is now certain that when I upgrade to either Hopper or Arcturus in a couple of years' time it will be no less than 16GB. An RTX 4070 will likely offer 16GB and at least match 3090 performance, would be my guess.

That said, we will likely see a 3070 with 16GB before long. Question is how much more they will charge for it.

Which card are you getting? :)
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
So every console is packing more than 8-10GB of VRAM, every AMD card has more than 8-10GB of VRAM...

But NVIDIA are telling us 8-10GB is enough? OK...

Consoles will target a maximum of 10GB to be used for graphics, as they only have 16GB total. Most titles will use ~6GB for graphics. I'm sure that if you have been following this thread, you will also have learned about hardware streaming data directly to the GPU, which reduces the need for more VRAM to keep things running smoothly.

I'm not sure what you meant to say, but AMD have 8GB cards.

AMD pulled a smart marketing decision by doubling up on cheap VRAM. A lot of people are falling for it. The same people who do not understand how VRAM is used.
 
Soldato
Joined
19 May 2012
Posts
3,633
Consoles will target a maximum of 10GB to be used for graphics, as they only have 16GB total. Most titles will use ~6GB for graphics. I'm sure that if you have been following this thread, you will also have learned about hardware streaming data directly to the GPU, which reduces the need for more VRAM to keep things running smoothly.

I'm not sure what you meant to say, but AMD have 8GB cards.

AMD pulled a smart marketing decision by doubling up on cheap VRAM. A lot of people are falling for it. The same people who do not understand how VRAM is used.


So AMD are pulling smart marketing tricks with their VRAM, but NVIDIA aren't with RTX and DLSS (features used sparingly, and RTX slaughters FPS)?
I'm just not buying it. We don't pay megabucks in PC land to have the same or less usable VRAM than a gaming console that costs a fraction of the price.

I'm probably getting a 3080, but NVIDIA fanboys are clutching at straws for our gimped cards. Watch Dogs: Legion (still a game which plays on last-gen hardware) has some troubles with the VRAM-restricted 3000-series cards, which is requiring performance patches etc.

And you're manipulating the facts there. AMD's last-generation card has 8GB of VRAM. All of their newly announced cards are 16GB.

My fully modded Skyrim cripples my 2080 at 4K due to VRAM limitations (love those photorealistic textures).
 
Associate
Joined
21 Oct 2013
Posts
2,059
Location
Ild
Consoles will target a maximum of 10GB to be used for graphics, as they only have 16GB total. Most titles will use ~6GB for graphics. I'm sure that if you have been following this thread, you will also have learned about hardware streaming data directly to the GPU, which reduces the need for more VRAM to keep things running smoothly.

I'm not sure what you meant to say, but AMD have 8GB cards.

AMD pulled a smart marketing decision by doubling up on cheap VRAM. A lot of people are falling for it. The same people who do not understand how VRAM is used.
Consoles are not the benchmark of quality when you are spending £500 on a GPU, though.
 
Soldato
Joined
6 Feb 2019
Posts
17,464
How many "4K" games are actually using 4K assets? The raw resolution of the asset has a massive impact on VRAM requirements, and just upscaling the asset to match whatever screen output you chose doesn't make the asset quality any better.

This becomes even more apparent when you try to play games at 8K: in Linus's 8K video he pointed out that many of Doom Eternal's textures look like absolute turd because they are clearly at a much, much lower pixel count and are just getting stretched out.
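To put rough numbers on the point above: here is a back-of-envelope sketch of how texture resolution drives VRAM cost. This assumes uncompressed RGBA8 (4 bytes per pixel) with a full mipmap chain; real games mostly use block-compressed formats (BC1/BC7 etc.) that cut these figures by roughly 4-8x, so treat it as an illustration of the scaling, not exact engine behaviour.

```python
# Back-of-envelope VRAM cost of a single uncompressed RGBA8 texture.
# A full mipmap chain adds roughly 1/3 on top of the base level.
def texture_mib(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 ** 2)  # convert bytes to MiB

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: {texture_mib(size, size):.1f} MiB")
# 1024x1024: 5.3 MiB
# 2048x2048: 21.3 MiB
# 4096x4096: 85.3 MiB
```

Each doubling of texture resolution quadruples the memory footprint, which is why a scene full of genuine 4K assets eats VRAM so much faster than the same scene with upscaled lower-resolution ones.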
 
Associate
Joined
15 Jun 2009
Posts
2,189
Location
South London
Yeah probably. But I doubt it will be as bad as say Fury was. I am happy AMD went with 16gb. It is now certain that when I upgrade to either Hopper or Arcturus in a couple of years time it will be no less than 16gb. A RTX 4070 will likely offer 16gb and at least match 3090 performance would be my guess.

That said, we will likely see a 3070 with 16gb before long. Question is how much more will they charge for it.

Which card are you getting? :)
3080 TUF OC on backorder, but I'm keeping an eye on whether NVIDIA react to the great competition AMD have put on the table.

Had to pick up a cheap 2080 Ti for Cyberpunk after Ampere landed... depending on how many years it takes for the 3080 to ship, it may be cancelled :)
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
So AMD are pulling smart marketing tricks with their VRAM, but NVIDIA aren't with RTX and DLSS (features used sparingly, and RTX slaughters FPS)?
I'm just not buying it. We don't pay megabucks in PC land to have the same or less usable VRAM than a gaming console that costs a fraction of the price.

Just ask yourself what would people be talking about now if AMD had launched 8/10GB cards also?

We were talking about VRAM. What has that got to do with DLSS and ray tracing? In general, we pay for a chip with more silicon containing fewer defects.

I'm probably getting a 3080, but NVIDIA fanboys are clutching at straws for our gimped cards. Watch Dogs: Legion (still a game which plays on last-gen hardware) has some troubles with the VRAM-restricted 3000-series cards, which is requiring performance patches etc.

Does it have an issue with VRAM? Are they really ditching all those 1080/2080/5700XT/etc. customers?

And manipulating the facts there. AMD's last generation card has 8GB of VRAM. All of their newly announced cards are 16GB.

Their lower end model has 8GB.

My fully modded Skyrim cripples my 2080 at 4k due to VRAM limitations (love those photorealistic textures).

Not the first to mention Skyrim. I have no idea how it handles VRAM. Maybe we will get lucky and see a Skyrim RT launch. Was the 2080 a 4K card? I thought the 2080 Ti was considered the 4K model due to having more bandwidth and a more powerful GPU.

Both consoles, AMD and Nvidia are looking to stream such data directly to the GPU, reducing the need for large amounts of VRAM.

I'm probably sticking with my 3080 order as I'm after RT, which I think will only work well below 4K no matter how much VRAM it has. If AMD could provide equal or better RT support then I'd swap as I prefer AMD as a company.
 
Associate
Joined
25 Sep 2020
Posts
128
Yeah probably. But I doubt it will be as bad as say Fury was. I am happy AMD went with 16gb. It is now certain that when I upgrade to either Hopper or Arcturus in a couple of years time it will be no less than 16gb. A RTX 4070 will likely offer 16gb and at least match 3090 performance would be my guess.

That said, we will likely see a 3070 with 16gb before long. Question is how much more will they charge for it.

Which card are you getting? :)

https://videocardz.com/newz/nvidia-preparing-geforce-rtx-3080-ti-with-9984-cuda-cores

Those were scrapped.

Newly leaked: a 3070 Ti with 10GB GDDR6X and a 3080 Ti with 12GB GDDR6X.
 
Soldato
Joined
19 May 2012
Posts
3,633
Just ask yourself what would people be talking about now if AMD had launched 8/10GB cards also?

We were talking about VRAM. What has that got to do with DLSS and ray tracing? In general, we pay for a chip with more silicon containing fewer defects.



Does it have an issue with VRAM? Are they really ditching all those 1080/2080/5700XT/etc. customers?



Their lower end model has 8GB.



Not the first to mention Skyrim. I have no idea how it handles VRAM. Maybe we will get lucky and see a Skyrim RT launch. Was the 2080 a 4K card? I thought the 2080 Ti was considered the 4K model due to having more bandwidth and a more powerful GPU.

Both consoles, AMD and Nvidia are looking to stream such data directly to the GPU, reducing the need for large amounts of VRAM.

I'm probably sticking with my 3080 order as I'm after RT, which I think will only work well below 4K no matter how much VRAM it has. If AMD could provide equal or better RT support then I'd swap as I prefer AMD as a company.


My 2080 can handle Skyrim at 4K fine, but when the textures get heavy and exceed 8GB it gets utterly destroyed.
 
Associate
Joined
17 Sep 2020
Posts
624
I can end this debate that endlessly loops the same tired old questions and arguments, the same on page 18 as on page 1.


Don't buy the 8GB cards: your c0ck will fall off, your mother-in-law will move in and your PC will burst into flames. It's that serious!

Also, it doesn't matter if 8GB is enough or not; they all sold out in less than a second, so the whole thread's pointless. The card's unobtainable, and by the time it's available again it will be superseded, the price will be at LEAST £100 over MSRP, and the OC versions that offer NOTHING are 3080 money.

So no, don't buy the 3070: it's not a good price and it's not available, and that invalidates the question of 8GB being enough, period. **Lock Thread**
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
What's happened to hair, 18:45? Do you really expect to run a 3070 at 4K max? I'd have thought the 3070 was for 1440p, given its reduced GPU and bandwidth.

It's just a bit of a shame, is all. The 3070 is perfectly capable of 4K from a power perspective. However, you can't utilise that power because it won't have enough VRAM. I was just more hopeful that this round would finally bring in affordable 4K gaming, but I guess Nvidia want you to buy a 3080 really badly, at massively inflated prices over RRP.

BTW, as for the game? I watched a full video review last night (not going to get it, doesn't sound like my bag) and they were saying it's not even a full next-gen title. Seems to be half previous gen with half next gen added on.

This situation won't get better. Those who think it will obviously haven't been in this game very long.
 
Associate
Joined
25 Sep 2020
Posts
128
I can end this debate that endlessly loops the same tired old questions and arguments, the same on page 18 as on page 1.


Don't buy the 8GB cards: your c0ck will fall off, your mother-in-law will move in and your PC will burst into flames. It's that serious!

Also, it doesn't matter if 8GB is enough or not; they all sold out in less than a second, so the whole thread's pointless. The card's unobtainable, and by the time it's available again it will be superseded, the price will be at LEAST £100 over MSRP, and the OC versions that offer NOTHING are 3080 money.

So no, don't buy the 3070: it's not a good price and it's not available, and that invalidates the question of 8GB being enough, period. **Lock Thread**
Nah, the 3070 launch went way better than the 3080's. I know a lot of people that got the cards.
 