NVIDIA 4000 Series

So you dispute his statement that the 2080 was crippled by its 8GB of VRAM in doom eternal and that Nvidia knew this and used it to try and sell everyone on the idea that the 3080 is twice as fast as the 2080? Okay


I don't remember anyone giving a timeline, but it could have been after I checked out of the thread.

However I do remember some of your examples. For example, the HZD issue was about someone going from a 5700XT to a 6800XT. I don't remember anyone using it as an example of the 3080 VRAM issue, but it was used to raise the question of game engines automatically trying to account for VRAM issues by lowering textures, even if in this instance it was a bug where the system downgraded the textures on the wrong object.
Godfall did use more than 10GB when you activated all the bells and whistles including the AMD bolt ons. Did this change?
Regarding the deathloop problem, I didn't pay close attention, but I remember that thread being like watching you have an argument with yourself, because you kept bringing up points that people were not talking about. TBF I could have just missed those points.

I don't know about the rest of them but something tells me you've stretched the truth like Nvidia's marketing talking about the 3080.


Now you are being disingenuous, or do you really believe that a feature that needs to be designed and baked into the GPU core was intentionally designed to make them look bad in benchmarks (and potentially damage sales) because it would be more important in selling a product 2 years down the line. Wait, hold on, is this you secretly admitting that RT didn't actually matter back in 2020? :eek:
Re-read my post....

And see this bit:

It's getting late so I have only skimmed the above post, but "crippling 2080 to make the 3080 look better than it really is"........ Can I have some of what you're smoking please :cry: A 3080 destroys a 2080 and beats an extreme OC 2080ti too.... Do you not remember the footage of doom eternal (another "supposed" vram problematic game...)

I can't recall but was it not the 2080ti used for the comparisons? Either way, the point still stands: a 3080 destroys a 2080 and comfortably beats a heavily OC'd 2080ti too, despite the 2080ti having more vram.

Doom eternal is actually one of the best games for vram optimisation/usage too, i.e. it will play with little to no stuttering or fps drops because it properly manages vram allocation.


Even early on in the thread you had people "insisting" that within the year we would be seeing numerous titles showing issues because of vram.


People definitely did use HZD as an example of "vram issues", Matt being the main one, despite, again, there being several patches to address "bugs" related to vram/memory management, as it was even happening on gpus with large amounts of vram. It wasn't a 3080-specific thing but it was used as "evidence" to support the narrative. Funny thing is, the video I linked of a 3080 vs 6800xt actually looked to have texture rendering issues on the 6800xt, but of course that was ignored, the same way Matt's RE village low-res textures were ignored (sorry I forgot, video/yt encoding.....) along with bang4buck's 3090 frame spikes :cry:

Godfall:

64JABaz.png


Much wow....

Also look at bang4buck's lovely frame latency on the 3090 (so the above wasn't just a case of RT crippling amd), i.e. the game runs badly regardless of vram amount.

Deathloop wasn't a big one like FC 6, but you had 1 or 2 guys post a screenshot of their 3090 using 20gb of vram and make some comment like "glad to have loads of vram", when in reality it was just fully allocating the vram and making no difference to performance.

But as you said yourself:

it could have been after I checked out of the thread.

So it's a bit silly to make this statement if you didn't even keep up to date and read the thread fully.....

I don't know about the rest of them but something tells me you've stretched the truth like Nvidia's marketing talking about the 3080.


Again, do you think at the "time of launch" it was possible for nvidia to provide more than 10GB of vram whilst keeping the cost at £650? What has proved to be of more value since the rdna 2/ampere launch, the extra vram or the extra RT perf? Obviously this is subjective depending on one's usage/needs but again, I think you'll find it's much easier to list RT titles where amd suffers far more than it is to list vram titles where a 3080 struggles; one should compile a list ;)

Guessing you are skimming posts again.....

At the time of my 3080 purchase I didn't care "that" much for RT or dlss tbf; it was only really cp 2077 and control which had worthy RT and really "needed" dlss. But a 3080 FE worked out cheaper and easier than trying to get a 6800xt at msrp, not to mention the 3080 matches a 6800xt in non-RT workloads anyway, so the RT/tensor cores + dlss were just a bonus. Given that I ended up playing more RT titles than pure rasterization ones, along with using dlss, it worked out very well and was a much better buy than I was expecting in the end. If I had picked up a 6800xt I would personally have regretted it massively and probably tried to switch over to a 3080 asap after seeing all the games I'm interested in getting rt and dlss.

Given how far RT has come in the last 2 years, I'm very happy AMD didn't have their UK store setup here ;)
 
"crippling 2080 to make the 3080 look better than it really is"


A 3080 destroys a 2080 and beats an extreme OC 2080ti too....

You do know that both of the above statements can be true?
If Nvidia's PR team made it look like it was 90% faster but it was only 60% faster, you could still say it "destroys XYZ" while acknowledging that they crippled it to make it look faster.

I can't recall but was it not the 2080ti used for the comparisons?
Officially it was compared to the 2080. Everyone said it should be compared to the 2080ti since the GPU core was the same class.


I remember FC6, that was the game where people couldn't understand why a jungle setting took up more VRAM than a city setting.

Again, do you think at the "time of launch" it was possible for nvidia to provide more than 10GB of vram whilst keeping the cost at £650?
With so few variables in your question, the answer is yes.
What you meant to ask was whether they could do that while maintaining their fat profit margins, to which the answer is no. But I don't care about their fat profit margins because they are of no benefit to me or anyone else buying GPUs.

What has proved to be of more value since the rdna 2/ampere launch, the extra vram or the extra RT perf? Obviously this is subjective depending on one's usage/needs but again, I think you'll find it's much easier to list RT titles where amd suffers far more than it is to list vram titles where a 3080 struggles; one should compile a list ;)
Your question is pointless unless you treat GPUs as if they only have a 2-year shelf life. To properly answer the question you would need to view the full life cycle, which has yet to occur ;)
As far as I am concerned, by the time RT is good enough to be a must-have feature for the majority of gamers, the RT performance on these current cards simply won't be enough to run the games.

Godfall did use more than 10GB when you activated all the bells and whistles including the AMD bolt ons

I'm assuming this is the video from your screenshot:

Capture.jpg



unsettled-tom-meme.jpg
 
You do know that both of the above statements can be true?
If Nvidia's PR team made it look like it was 90% faster but it was only 60% faster, you could still say it "destroys XYZ" while acknowledging that they crippled it to make it look faster.


Officially it was compared to the 2080. Everyone said it should be compared to the 2080ti since the GPU core was the same class.


I remember FC6, that was the game where people couldn't understand why a jungle setting took up more VRAM than a city setting.


With so few variables in your question, the answer is yes.
What you meant to ask was whether they could do that while maintaining their fat profit margins, to which the answer is no. But I don't care about their fat profit margins because they are of no benefit to me or anyone else buying GPUs.


Your question is pointless unless you treat GPUs as if they only have a 2-year shelf life. To properly answer the question you would need to view the full life cycle, which has yet to occur ;)
As far as I am concerned, by the time RT is good enough to be a must-have feature for the majority of gamers, the RT performance on these current cards simply won't be enough to run the games.



I'm assuming this is the video from your screenshot:

Capture.jpg



unsettled-tom-meme.jpg

Can you link me to the nvidia PR video/slides where they compared the 3080 to the 2080?

Was it somewhere in the official reveal?


I hope you're being sarcastic but just in case...... you do realise there is far more to a game's vram usage than just the environments, please tell me you do.... It's a variety of things, i.e. unique assets rather than just reusable assets copy-pasted throughout the game world (see days gone for an example of this), the resolution of said textures, and the density/how many of these assets are crammed into the viewable scene.
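To put some rough numbers on that (purely illustrative figures I've picked, not taken from any particular game), here's a back-of-envelope sketch of how texture resolution, compression format and the number of unique assets kept resident multiply up:

```cpp
#include <cstdio>

// Rough footprint of a single texture: width * height * bytes-per-texel,
// plus roughly a third extra for the full mip chain.
double textureMiB(int w, int h, double bytesPerTexel) {
    return w * static_cast<double>(h) * bytesPerTexel * (4.0 / 3.0) / (1024.0 * 1024.0);
}

int main() {
    // A 4K texture stored block-compressed (BC7, 1 byte/texel)
    // vs the same texture left uncompressed (RGBA8, 4 bytes/texel).
    std::printf("4096x4096 BC7:   %.0f MiB\n", textureMiB(4096, 4096, 1.0)); // ~21 MiB
    std::printf("4096x4096 RGBA8: %.0f MiB\n", textureMiB(4096, 4096, 4.0)); // ~85 MiB

    // Keeping ~200 *unique* 4K BC7 textures resident is already ~4 GiB,
    // before geometry, render targets, shadow maps and everything else.
    std::printf("200 unique 4K BC7 textures: %.1f GiB\n",
                200.0 * textureMiB(4096, 4096, 1.0) / 1024.0);
    return 0;
}
```

Which is why a game that reuses the same handful of assets everywhere can look "busy" yet sip vram, while one with lots of unique, high-res assets in view at once eats it.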

It's called keeping it a simple question; there was no way for nvidia to provide more than 10gb of vram without either cutting costs elsewhere and/or adding a considerable bump to the price tag. Do you think the 3080 would have been as popular as it was had it been priced even higher just to have extra vram? (£650 was already pushing it for a lot of people; any more expensive and I wouldn't have bothered either)

Well, we already know which way one brand of gpus is going given the sacrifices that have had to be made from day 1 in numerous games, let alone in recent heavier RT titles; meanwhile, vram-heavy titles.....

*insert johntravolta.gif*


Screenshot? Look closer, it's a 6800xt :cry:


Also, not sure why you are highlighting the AMD fidelityfx LPM option, you do realise it's to do with hdr:


Also, no that wasn't the 3090 video I was referring to either. It was posted in the 10gb thread.
 
This was me trying FC6 with the HD texture pack.

RTX 2070 Super 8GB.

The game even felt mushy and there was lag; it's why my aim was so bad.

 
Can you link me to the nvidia PR video/slides where they compared the 3080 to the 2080?

Was it somewhere in the official reveal?
30 min mark of your video.

I hope you're being sarcastic but just in case...... you do realise there is far more to a game's vram usage than just the environments, please tell me you do.... It's a variety of things, i.e. unique assets rather than just reusable assets copy-pasted throughout the game world (see days gone for an example of this), the resolution of said textures, and the density/how many of these assets are crammed into the viewable scene.
This phrase "unique assets" seemed awfully familiar, so I did a search and found these posts

Glad to see part of my post stuck with you. I'll ignore the fact that you misrepresented what I said.

It's called keeping it a simple question; there was no way for nvidia to provide more than 10gb of vram without either cutting costs elsewhere and/or adding a considerable bump to the price tag. Do you think the 3080 would have been as popular as it was had it been priced even higher just to have extra vram? (£650 was already pushing it for a lot of people; any more expensive and I wouldn't have bothered either)

Cutting costs? You're talking like Nvidia has tiny-ass margins on these things. Doesn't Nvidia have one of the highest margins in the hardware industry? I think they are only beaten by Apple.

Also, not sure why you are highlighting the AMD fidelityfx LPM option, you do realise it's to do with hdr:
You missed a part in my post, see below. I know it was for tonemapping. For some reason it increased VRAM usage and bumped it over 10GB.
Godfall did use more than 10GB when you activated all the bells and whistles including the AMD bolt ons.

The video in question still doesn't have the setting enabled.
 
Mhh..... I hate that, as if AMD are asking game developers to make add-ons that, if used, cripple most of Nvidia's range.

Most games these days are made first and foremost for consoles and are then ported to PC. The PS5 has 16GB of memory, so ironically game developers have much more freedom there for those nice high-res textures, which have to be taken out and replaced with lower-res textures to accommodate our crappy PCs.

Thanks Nvidia...
 
So are we just going to pretend that Nvidia didn't come on stage, tell everyone the 3080 is twice as fast as the 2080, and use doom eternal as one of the benchmarks to prove it? Or did you misread the post you quoted because you were too busy thinking up a joke about smoking weed?


Nobody said that, maybe you need to lay off the ganja ;). People said that as time goes on, more games with higher VRAM requirements will come out. Stop acting like 2 years is a long time in the context of AAA game development.
Obviously when those games come out you will start saying that everyone who owns a 3080 should have upgraded to the 4000 series anyway, thereby proving what everyone said was right. It is an artificial limitation to get people to upgrade GPUs.

So you dispute his statement that the 2080 was crippled by its 8GB of VRAM in doom eternal and that Nvidia knew this and used it to try and sell everyone on the idea that the 3080 is twice as fast as the 2080? Okay

However I do remember some of your examples. For example, the HZD issue was about someone going from a 5700XT to a 6800XT. I don't remember anyone using it as an example of the 3080 VRAM issue, but it was used to raise the question of game engines automatically trying to account for VRAM issues by lowering textures, even if in this instance it was a bug where the system downgraded the textures on the wrong object. Godfall did use more than 10GB when you activated all the bells and whistles including the AMD bolt ons. Did this change?

I don't know about the rest of them but something tells me you've stretched the truth like Nvidia's marketing talking about the 3080.

Now you are being disingenuous, or do you really believe that a feature that needs to be designed and baked into the GPU core was intentionally designed to make them look bad in benchmarks (and potentially damage sales) because it would be more important in selling a product 2 years down the line. Wait, hold on, is this you secretly admitting that RT didn't actually matter back in 2020? :eek:

Just catching up, it's like déjà vu from other threads! I eventually used the ignore feature; it would have been more productive discussing with a chunk of moss.
 
Hmm, depending on what Callisto Protocol etc runs like, I think I will go with a 4080 Ti if the 3080 Ti is unable to maintain maximum RTX/settings at 3440x1440 in games like that later this year. I think it pretty damn well should though.

Those waiting and thinking of getting a cheaper 30xx new: it may not be worth waiting, because:

 
30 min mark of your video.


This phrase "unique assets" seemed awfully familiar, so I did a search and found these posts

Glad to see part of my post stuck with you. I'll ignore the fact that you misrepresented what I said.



Cutting costs? You're talking like Nvidia has tiny-ass margins on these things. Doesn't Nvidia have one of the highest margins in the hardware industry? I think they are only beaten by Apple.


You missed a part in my post, see below. I know it was for tonemapping. For some reason it increased VRAM usage and bumped it over 10GB.


The video in question still doesn't have the setting enabled.

So are there "zero" scenarios where a 3080 is twice as fast as a 2080 then? As in them slides/snippet, nothing was specifically mentioned about where exactly a 3080 is twice as fast as a 2080. Maybe we should take a look at amds marketing/pr slides again.... anyone remember "RAGE MODE"? :cry:

Misrepresented? You said it in black and white text:

I remember FC6, that was the game where people couldn't understand why a jungle setting took up more VRAM than a city setting.

i.e. indicating that the environment of a game dictates vram usage :cry:

But it seems you are aware of this and agreed with me, as linked there, so what was the purpose of stating this, other than to argue for the sake of it.....

Got a link to show the vram usage increase? Also, I haven't looked into it too much, but why would one use it without an HDR display, and regardless, don't you think it's a bit odd for an HDR setting to increase vram usage? I have 2 HDR displays and enabling HDR, including messing with the special k hdr application, does not increase vram usage...... I had a quick google and people reported it over-saturating colours, so I'm not sure it's a setting that should be enabled even with an HDR display.... Once again, clutching at straws.

And either way, thoughts on that lovely 6800xt performance in godfall? Enabling the LPM option isn't going to help that out.....

Mhh..... I hate that, as if AMD are asking game developers to make add-ons that, if used, cripple most of Nvidia's range.

Most games these days are made first and foremost for consoles and are then ported to PC. The PS5 has 16GB of memory, so ironically game developers have much more freedom there for those nice high-res textures, which have to be taken out and replaced with lower-res textures to accommodate our crappy PCs.

Thanks Nvidia...

You do realise consoles are sacrificing various graphical settings along with RT as well as resolution (significantly) in order to hold a locked 60 fps?

Thanks sony/microsoft/amd.....

Just catching up, it's like déjà vu from other threads! I eventually used the ignore feature; it would have been more productive discussing with a chunk of moss.

Good to see you're still adding so much to these threads/discussions :cry:

People would be better off talking to a pile of **** tbh :D
 
Hmm, depending on what Callisto Protocol etc runs like, I think I will go with a 4080 Ti if the 3080 Ti is unable to maintain maximum RTX/settings at 3440x1440 in games like that later this year. I think it pretty damn well should though.

Those waiting and thinking of getting a cheaper 30xx new: it may not be worth waiting, because:


You can be sure RT is going to be pushed further with newer titles, especially nvidia-sponsored ones; ampere will hold up ok with settings appropriately adjusted, but who wants to be missing out on full RT goodness :cool:

Avatar in particular is going to be quite the showcase imo, ray traced only and on the snowdrop engine by massive :D
 
Since getting the 3080 Ti I have not played any game with settings reduced, it's all max or nothing really :p

Callisto Protocol is the game of the year for me to look forward to, having been a big Dead Space fan, so this will be the one to show off the RTX effects and stuff. Hopefully it uses DirectStorage or RTX IO to make even greater use of current VRAM sizes too.
 
Since getting the 3080 Ti I have not played any game with settings reduced, it's all max or nothing really :p

Callisto Protocol is the game of the year for me to look forward to, having been a big Dead Space fan, so this will be the one to show off the RTX effects and stuff. Hopefully it uses DirectStorage or RTX IO to make even greater use of current VRAM sizes too.
Yup, imagine having to reduce settings which make a huge difference to the visuals, and as a result the atmosphere of the game, just to get playable framerates. Oh, the pain ;) :p

That Callisto Protocol does look great; didn't realise it was getting ray tracing, though I shouldn't be surprised given how many games are coming out with some form of RT now :p ;)

The Dead Space remaster will be glorious too.

It's a shame RTX IO/DirectStorage hasn't taken off; maybe next year!
 
MS only released the DirectStorage API recently, so I have no doubt devs will be churning away implementing it in new titles, since it's effectively free performance gains and means games can be larger and more immersive as a result. I fully expect games coming out from later this year to at least use some form of it for streaming world assets and things. LTT did a really good video not long ago showing how much of a difference DS makes on vs off.

In one article I read, they are also using acoustic ray tracing in this game, so that should be very interesting, and in an interview the devs stated that they will be using future PC technologies, with more on that "later". Hopefully that means from launch; they still have plenty of time to implement it, I guess.
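For anyone wondering what "implementing it" actually looks like on the developer side, below is a minimal sketch of queueing a single file-to-GPU-buffer load, roughly following Microsoft's public DirectStorage samples. The asset path, the 16 MiB size and the bare-bones D3D12 setup are placeholders I've assumed, so treat it as an outline rather than production code:

```cpp
// Minimal DirectStorage sketch: stream one file straight into a GPU buffer.
// Assumes the DirectStorage SDK (dstorage.h / dstorage.lib) on Windows 10/11.
// "assets/level.bin" and the 16 MiB size are made-up placeholders.
#include <windows.h>
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device));

    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // Open the asset file through DirectStorage rather than regular Win32 I/O.
    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets/level.bin", IID_PPV_ARGS(&file));

    // One queue tied to the D3D12 device; requests on it go storage -> GPU
    // with minimal CPU involvement.
    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device.Get();
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // Destination: a plain GPU buffer big enough for the (placeholder) asset.
    const UINT64 assetSize = 16ull * 1024 * 1024;
    D3D12_HEAP_PROPERTIES heap{ D3D12_HEAP_TYPE_DEFAULT };
    D3D12_RESOURCE_DESC bufDesc{};
    bufDesc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    bufDesc.Width            = assetSize;
    bufDesc.Height           = 1;
    bufDesc.DepthOrArraySize = 1;
    bufDesc.MipLevels        = 1;
    bufDesc.SampleDesc.Count = 1;
    bufDesc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;
    ComPtr<ID3D12Resource> gpuBuffer;
    device->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &bufDesc,
                                    D3D12_RESOURCE_STATE_COMMON, nullptr,
                                    IID_PPV_ARGS(&gpuBuffer));

    // Describe the read: whole file -> whole buffer, no decompression here.
    DSTORAGE_REQUEST request{};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source      = file.Get();
    request.Source.File.Offset      = 0;
    request.Source.File.Size        = static_cast<UINT32>(assetSize);
    request.Destination.Buffer.Resource = gpuBuffer.Get();
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = static_cast<UINT32>(assetSize);
    request.UncompressedSize            = static_cast<UINT32>(assetSize);

    // Fence so we know when the data has actually landed on the GPU.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    fence->SetEventOnCompletion(1, done);

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence.Get(), 1);
    queue->Submit();                      // kick off the whole batch
    WaitForSingleObject(done, INFINITE);  // a real engine would poll, not block

    CloseHandle(done);
    return 0;
}
```

The point is that the CPU only describes the request; the actual transfer (and, once the GPU decompression path is in play, the decompression too) happens off the main thread, which is where the "free" streaming headroom comes from.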
 
Since getting the 3080 Ti I have not played any game with settings reduced, it's all max or nothing really :p

Callisto Protocol is the game of the year for me to look forward to, having been a big Dead Space fan, so this will be the one to show off the RTX effects and stuff. Hopefully it uses DirectStorage or RTX IO to make even greater use of current VRAM sizes too.

What resolution sir? I've studied benchmarks and it seems capable at 4k Ultra, but we all know a lot of settings can be turned down to high/medium with very little effect on visuals.
 
MS only released the DirectStorage API recently, so I have no doubt devs will be churning away implementing it in new titles, since it's effectively free performance gains and means games can be larger and more immersive as a result. I fully expect games coming out from later this year to at least use some form of it for streaming world assets and things. LTT did a really good video not long ago showing how much of a difference DS makes on vs off.

In one article I read, they are also using acoustic ray tracing in this game, so that should be very interesting, and in an interview the devs stated that they will be using future PC technologies, with more on that "later". Hopefully that means from launch; they still have plenty of time to implement it, I guess.
Do you kindly have the link to the video? If not, I will try and find it. Thanks.

Looks like it was released in March.


Windows 10 and 11 are compatible, but 10 will need external libraries, so games would need to ship the libraries, which some might not bother with.
 
So are there "zero" scenarios where a 3080 is twice as fast as a 2080 then?
Yeah, I believe that was @Purgatory's point. We got there in the end.

Misrepresented? You said it in black and white text:
Yes you did.
Here is your post
you do realise there is far more to a game's vram usage than just the environments,
I never said or implied that environments are the only thing that dictates VRAM usage.
I compared two specific settings/environments and said that one would be more VRAM intensive than the other. If you disagree and believe that a city setting is more VRAM intensive than a jungle setting/environment, or that they would be the same, then state why you think that.

You listed a bunch of items that affect VRAM usage, while seemingly ignoring or forgetting that the items you listed are the attributes that dictate why certain settings/environments are more VRAM intensive than others.
Settings/environments first, then you drill down to the specific attributes that affect VRAM.

A simplified example would be: a desert scene will be less VRAM intensive than a city scene because, in general, there are fewer items on screen at a given moment.
 
A lot of "ACTUALLY..." posts here :D this is awesome.

I ended up purchasing a Steam Deck, so I will wait and see if the 4000 series will be a good batch compared to the clustershaitshow that was the 3000 series.
 