10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
Caporegime
Joined
12 Jul 2007
Posts
40,543
Location
United Kingdom
More leaks confirming the 3080ti 20GB is coming > https://videocardz.com/newz/kopite7...ature-10496-cuda-cores-and-20gb-gddr6x-memory

The army of pending 3080 owners still deny the truth, as they can't accept that a 3080 Ti is coming so soon. Did they really think Nvidia would allow AMD Radeon to have vastly superior cards in terms of VRAM?

The above is in relation to running 4K on a 3080. Lower resolutions are IMO pointless (as this is a 4K card) and wasteful, as the previous gen was fine for 1080p/1440p.
Well, no one can say 16/20GB isn't enough video memory, that's for sure.
 
Associate
Joined
3 Nov 2020
Posts
26
More leaks confirming the 3080ti 20GB is coming > https://videocardz.com/newz/kopite7...ature-10496-cuda-cores-and-20gb-gddr6x-memory

The army of pending 3080 owners still deny the truth, as they can't accept that a 3080 Ti is coming so soon.

This is a bizarre comment. I paid £650 for my FE and I wouldn't pay a penny more; it's a graphics card, not a car. I have no doubt there is either a 3080 Super or Ti on the horizon, and I also have no doubt it will cost at least £850 for an FE, which for me, and I suspect many others, will be too much to spend on a toy. The AIBs will be pushing £1,000, and at those prices the wife would have me sleeping in the garden, let alone on the sofa.

If it came out in the next few months at £650 I would be annoyed for sure. But I don't see it.
 
Associate
Joined
15 Jun 2009
Posts
2,189
Location
South London
Sorry to say, and as much as it goes against the "pallet of salt" talk whenever leaks come out, as unbelievable as it sounds, Videocardz and Wccftech were pretty close to the mark on the rumoured specifications in the run-up to the Ampere launch.

The only thing that completely threw everyone off was NVIDIA claiming double the CUDA cores with an advanced architecture.

Nothing is confirmed until NVIDIA put it on the table, but surely 3080 owners can understand that they refuse to be second best to AMD. A big part of their business strategy is having the best at all costs.

The 6900 XT has spoiled the Ampere party, I'm afraid, at $999... I think a 3080 Ti is genuinely on the cards to fight back against AMD at a more competitive price than the 3090.
 
Associate
Joined
9 May 2007
Posts
1,284
More leaks confirming the 3080ti 20GB is coming > https://videocardz.com/newz/kopite7...ature-10496-cuda-cores-and-20gb-gddr6x-memory

The army of pending 3080 owners still deny the truth, as they can't accept that a 3080 Ti is coming so soon. Did they really think Nvidia would allow AMD Radeon to have vastly superior cards in terms of VRAM?

The above is in relation to running 4K on a 3080. Lower resolutions are IMO pointless (as this is a 4K card) and wasteful, as the previous gen was fine for 1080p/1440p.

The evidence at the moment is that 10GB is enough for current games and for the games due to be released (evidence already posted in the thread). This could change. Citing one game that isn't even released yet, when it's meant to use an extreme HD texture pack, is not really proof. It's called cherry-picking. It's also biased when that game is AMD-sponsored and designed to run on AMD hardware as a showcase. Shouting about it and stating 10GB is not enough is ridiculous.

Bet when all is said and done it runs fine on the RTX 3080.

https://videocardz.com/newz/nvidia-preparing-geforce-rtx-3080-ti-with-9984-cuda-cores
https://videocardz.com/newz/kopite7...ature-10496-cuda-cores-and-20gb-gddr6x-memory
Now there is a 3080 Ti 20GB and you post a link. What does that link state at the top?
"Please note that this post is tagged as a rumor."

Yet you state 10,496 CUDA cores, the same as the RTX 3090. So basically an RTX 3090 with less memory bandwidth and 4GB less memory.

Yet people ignore this https://videocardz.com/newz/amd-ray...dia-rt-core-in-this-dxr-ray-tracing-benchmark

The Radeon RX 6800 XT has 72 RAs while the GeForce RTX 3080 has 68 RT Cores. AMD has conducted tests using the Microsoft DXR SDK tool called 'Procedural Geometry'.

AMD:

Measured by AMD engineering labs 8/17/2020 on an AMD RDNA 2 based graphics card, using the Procedural Geometry sample application from Microsoft’s DXR SDK, the AMD RDNA 2 based graphics card gets up to 13.8x speedup (471 FPS) using HW based raytracing vs using the Software DXR fallback layer (34 FPS) at the same clocks. Performance may vary. RX-571

A Redditor, NegativeXyzen, tested an ASUS RTX 3080 TUF and reported an average framerate of 630.

We have also run the benchmark on the GeForce RTX 3080 at stock clock and power targets using a precompiled executable file. The RTX 3080 scored an average framerate of 635 across three 1-minute runs.

This means the GeForce RTX 3080 is around 33% faster than the Radeon RX 6800 XT in this particular benchmark. Of course, there are a few things that need to be taken into account, such as the testing platform and the DXR benchmark resolution (AMD did not clarify the resolution, hence we used the default one).

If this is true, AMD cards will be 1080p/1440p cards in Control, which is what a 2080 Ti can do. 16GB is enough for 1080p/1440p, right?
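The "around 33% faster" figure above is just the ratio of the two quoted framerates. A quick sanity check of the arithmetic, using the numbers from the linked article rather than any measurements of my own:

```python
# Relative speedup from the framerates quoted above: ~471 FPS for the
# RX 6800 XT (AMD's own HW raytracing number) vs ~630-635 FPS for the
# RTX 3080 in Microsoft's DXR "Procedural Geometry" sample.

def speedup_percent(fast_fps: float, slow_fps: float) -> float:
    """How much faster fast_fps is than slow_fps, in percent."""
    return (fast_fps / slow_fps - 1.0) * 100.0

amd_fps = 471      # AMD engineering labs result
nvidia_fps = 630   # Redditor's RTX 3080 TUF average

print(f"RTX 3080 is ~{speedup_percent(nvidia_fps, amd_fps):.0f}% faster")
```

With the 635 FPS stock run instead, the ratio comes out closer to 35%, so "around 33%" is in the right ballpark either way.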
 
Soldato
Joined
31 Oct 2002
Posts
9,860
The evidence at the moment is that 10GB is enough for current games and for the games due to be released (evidence already posted in the thread). This could change. Citing one game that isn't even released yet, when it's meant to use an extreme HD texture pack, is not really proof. It's called cherry-picking. It's also biased when that game is AMD-sponsored and designed to run on AMD hardware as a showcase. Shouting about it and stating 10GB is not enough is ridiculous.

Bet when all is said and done it runs fine on the RTX 3080.

https://videocardz.com/newz/nvidia-preparing-geforce-rtx-3080-ti-with-9984-cuda-cores
https://videocardz.com/newz/kopite7...ature-10496-cuda-cores-and-20gb-gddr6x-memory
Now there is a 3080 Ti 20GB and you post a link. What does that link state at the top?
"Please note that this post is tagged as a rumor."

Yet you state 10,496 CUDA cores, the same as the RTX 3090. So basically an RTX 3090 with less memory bandwidth and 4GB less memory.

Yet people ignore this https://videocardz.com/newz/amd-ray...dia-rt-core-in-this-dxr-ray-tracing-benchmark

If this is true, AMD cards will be 1080p/1440p cards in Control, which is what a 2080 Ti can do. 16GB is enough for 1080p/1440p, right?


This isn't a court case, buddy. It's very simple: the 3080 is supposed to be the flagship, yet it has less VRAM than the 1080 Ti, which released 3.5 years ago.

Games are already pressing up against the 10GB limit at 4K, such as Doom Eternal, which needs between 8 and 10GB, and Flight Simulator 2020, which needs more than 10GB.
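Claims like these are easy to eyeball yourself by polling the driver's reported memory use while a game runs. A minimal sketch, assuming an Nvidia card with `nvidia-smi` on the PATH; note that *allocated* VRAM is not the same as *required* VRAM, since many engines cache textures opportunistically, so readings tend to overstate true need:

```python
# Poll VRAM use via nvidia-smi while a game is running.
# Allocated VRAM >= required VRAM: engines often cache opportunistically.
import subprocess

def parse_mib(smi_line: str) -> int:
    """Parse one line of `nvidia-smi ... --format=csv,noheader,nounits`."""
    return int(smi_line.strip())

def vram_used_mib() -> int:
    """Ask the driver how much memory is in use on GPU 0, in MiB."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"], text=True)
    return parse_mib(out.splitlines()[0])
```

Call `vram_used_mib()` in a loop every few seconds while playing and log the peak.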

Then you have next-gen games like Godfall, which the developer has officially confirmed requires 12GB of VRAM. Do you know more than the developer?

The next-gen consoles have double the total memory of the last generation. Developers will make use of that extra memory, and it will carry over to PC ports, which will require more VRAM than previous games.

It seems Nvidia also agree, else why would they be putting 20GB on the 3080ti? Why not 10GB?....

The 3080 Ti will be an absolutely fantastic card, and I'm excited to buy it when it releases. I'm going to need an HDMI 2.1 card to tide me over until then, though, so I'll be picking up whatever card I can find in stock!
 
Associate
Joined
3 Nov 2020
Posts
26
Then you have next-gen games like Godfall, which the developer has officially confirmed requires 12GB of VRAM. Do you know more than the developer?

Sad to hear my 3080 won't hit the minimum specs and the game will be unplayable.

In all seriousness though, it will be fine at 3440x1440, and at 4K it will probably need to run high textures rather than ultra, with no discernible difference.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
Then you have next-gen games like Godfall, which the developer has officially confirmed requires 12GB of VRAM.
That is misleadingly worded. Godfall doesn't require 12GB of VRAM; it has a specific Ultra mode with 4K x 4K textures, developed in conjunction with AMD, and the devs say that running this specific mode uses 12GB of VRAM.
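Some back-of-the-envelope arithmetic shows why a 4K-texture mode can plausibly use that much memory. The texture count below is an illustrative assumption, not Godfall's actual asset budget:

```python
# Rough VRAM cost of uncompressed 4096x4096 RGBA8 textures.
# The "150 textures resident" figure is an illustrative assumption.

def texture_mib(width: int, height: int, bytes_per_pixel: int = 4,
                mipmaps: bool = True) -> float:
    """Approximate one texture's size in MiB (a full mip chain adds ~1/3)."""
    size = width * height * bytes_per_pixel
    if mipmaps:
        size = size * 4 // 3
    return size / (1024 * 1024)

one_4k = texture_mib(4096, 4096)  # ~85 MiB with mips, 64 MiB without
print(f"one 4Kx4K texture: {one_4k:.0f} MiB")
print(f"150 resident at once: {150 * one_4k / 1024:.1f} GiB")
```

In practice engines stream and block-compress textures, so the real per-texture cost is far lower; the point is only that 4K x 4K assets scale into the gigabytes very quickly.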
 
Associate
Joined
9 May 2007
Posts
1,284
Sad to hear my 3080 won't hit the minimum specs and the game will be unplayable.

In all seriousness though, it will be fine at 3440x1440, and at 4K it will probably need to run high textures rather than ultra, with no discernible difference.

The real minimum specs https://www.pcgamer.com/uk/godfall-pc-system-requirements/

Minimum

  • OS: Windows 10
  • CPU: AMD Ryzen 5 1600 or Intel Core i5-6600
  • GPU: AMD Radeon RX 580 8GB or Nvidia GeForce GTX 1060 6GB
  • RAM: 12GB
So your RTX 3080 does not beat a GTX 1060 6GB?
 
Associate
Joined
9 May 2007
Posts
1,284
This isn't a court case, buddy. It's very simple: the 3080 is supposed to be the flagship, yet it has less VRAM than the 1080 Ti, which released 3.5 years ago.

Games are already pressing up against the 10GB limit at 4K, such as Doom Eternal, which needs between 8 and 10GB, and Flight Simulator 2020, which needs more than 10GB.

Then you have next-gen games like Godfall, which the developer has officially confirmed requires 12GB of VRAM. Do you know more than the developer?

The next-gen consoles have double the total memory of the last generation. Developers will make use of that extra memory, and it will carry over to PC ports, which will require more VRAM than previous games.

It seems Nvidia also agree, else why would they be putting 20GB on the 3080ti? Why not 10GB?....

The 3080 Ti will be an absolutely fantastic card, and I'm excited to buy it when it releases. I'm going to need an HDMI 2.1 card to tide me over until then, though, so I'll be picking up whatever card I can find in stock!

There is no RTX 3080 Ti; it does not exist. Nvidia have rejected all the rumours of more VRAM so far. The rumours are just that: rumours.
 
Soldato
Joined
20 Aug 2019
Posts
3,030
Location
SW Florida
I notice that people are NOT saying that the "Flagship card must be able to run all games maxed at 4K".

Instead they seem willing to accept a card that cannot max settings on all games... But if the reason they have to turn down a setting is VRAM, well, now *this* is just unacceptable!

A 3090 can't run my favorite sim maxed out in VR, and that game doesn't even use 8GB of VRAM. I'm not calling for everyone to grab their torches and pitchforks, though.
 
Associate
Joined
9 May 2007
Posts
1,284
I notice that people are NOT saying that the "Flagship card must be able to run all games maxed at 4K".

Instead they seem willing to accept a card that cannot max settings on all games... But if the reason they have to turn down a setting is VRAM, well, now *this* is just unacceptable!

A 3090 can't run my favorite sim maxed out in VR, and that game doesn't even use 8GB of VRAM. I'm not calling for everyone to grab their torches and pitchforks, though.

Witcher 2 can't be run at 4K @ 60 fps with all the settings turned on; it just melts any GPU that tries. Guess all the GPUs need more VRAM for Witcher 2, a game released in 2011. Guess we were all willing to accept cards that could not, and most likely still can't, run Witcher 2 at max settings.

It's not like, if you need more VRAM, you can't just use 8:1 compression.
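The "8:1 compression" quip is sarcasm, but it's not far off what GPUs actually do: block-compressed texture formats have fixed ratios, e.g. BC1 at 0.5 bytes/pixel (8:1 vs 32-bit RGBA) and BC7 at 1 byte/pixel (4:1). A sketch of the per-texture savings, using the standard DirectX BCn family purely for illustration:

```python
# Fixed-ratio GPU block compression vs uncompressed RGBA8,
# for a single 4096x4096 texture (no mipmaps).

BYTES_PER_PIXEL = {
    "RGBA8": 4.0,  # uncompressed, 32 bits/pixel
    "BC7": 1.0,    # high-quality block compression, 4:1
    "BC1": 0.5,    # smallest block compression, 8:1
}

def texture_bytes(width: int, height: int, fmt: str) -> int:
    """Size in bytes of one texture stored in the given format."""
    return int(width * height * BYTES_PER_PIXEL[fmt])

for fmt in ("RGBA8", "BC7", "BC1"):
    mib = texture_bytes(4096, 4096, fmt) / (1024 * 1024)
    print(f"{fmt:6s}: {mib:5.1f} MiB")  # 64.0 / 16.0 / 8.0 MiB
```

The catch, of course, is that shipping games already use these formats, so the ratio can't simply be applied again on top.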
 
Associate
Joined
10 Mar 2013
Posts
1,391
Location
Plymouth
That is misleadingly worded. Godfall doesn't require 12GB of VRAM; it has a specific Ultra mode with 4K x 4K textures, developed in conjunction with AMD, and the devs say that running this specific mode uses 12GB of VRAM.
Almost certainly to rub it in Nvidia's face.

So your RTX 3080 does not beat a GTX 1060 6GB?
He was being sarcastic
 
Soldato
Joined
12 May 2014
Posts
5,236
That is misleadingly worded. Godfall doesn't require 12GB of VRAM; it has a specific Ultra mode with 4K x 4K textures, developed in conjunction with AMD, and the devs say that running this specific mode uses 12GB of VRAM.
What specifically did they say? The term "4K textures" is ambiguous. Are all textures 4K? Are only some textures 4K?
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
Almost certainly to rub in Nvidias face
You say that, and it may be true, but considering that the 2080 Ti had 11GB two years ago, is it really unreasonable that a next-gen engine with a crazy Ultra mode uses more than 10GB of VRAM? It's borderline whether they deliberately tried to rub it in Nvidia's face or whether that's just how much the mode used when they had finished with it.

Either way, we all know Nvidia skimped on the VRAM, and the launch owners will be the ones to suffer for it next year.
 