NVIDIA 4000 Series

According to gamegpu, at 4K RT ultra the 10GB 3080 gets double the minimums and 10% better averages than the 7900XTX... but never mind that, let's just keep bashing the 3080 and the 4070ti

Seriously?! :eek:

Link/source!

Either way, even if the 3080/4070ti get hit hard because of VRAM, so does the 7900XTX despite having 24GB of VRAM...

[benchmark chart]

Obviously the game is in need of some patching.


EDIT:

Has anyone got perf. figures for RDNA 2?
 
Call me a grumpy old git but I think the idea is that for Hogwarts, once you patch the game with DLSS3.0 and FG, everything runs just fine at 4k with RT on :)

If you can't run the DLSS3.0+FG patch then you are out of luck at 4k with RT on.

Even the 4070ti is smoking along at 4k with RT on with DLSS3.0+FG.
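
For anyone wondering what DLSS is actually doing at 4K: it renders internally at a lower resolution and upscales. The sketch below prints the commonly quoted per-axis scale factors for each mode; these defaults are assumptions from public documentation, and individual games can override them.

```python
# Rough sketch: internal render resolutions DLSS uses for a 4K output.
# The per-axis scale factors below are the commonly quoted defaults
# (assumed here; individual games can override them).

OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K output

DLSS_MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

for mode, scale in DLSS_MODES.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{mode:17s} -> {w}x{h} internal "
          f"({scale * scale:.0%} of the output pixels actually rendered)")
```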
 
And I'm telling you, as long as your original framerate is above 50, you don't feel any more input latency than you would if you played at 50 in the first place. If I sat you down in a game you've never played before with FG on, you would never say "this is laggy".

I also have a 120Hz monitor, so what?
You obviously don't know how input lag works if you think you can tell someone else how they would perceive it.
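
On the latency argument above, here is a rough back-of-envelope sketch. It assumes frame generation has to hold back roughly one rendered frame and ignores the rest of the input pipeline (CPU, driver, display), so treat the numbers as ballpark only.

```python
# Back-of-envelope frame-time maths for the latency argument above.
# Simplifying assumption: frame generation interpolates between two rendered
# frames, so it holds back roughly one base frame; CPU/driver/display
# latency is ignored entirely.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps in (30, 50, 80):
    base_ms = frame_time_ms(base_fps)
    presented_fps = base_fps * 2     # FG roughly doubles presented fps
    extra_latency_ms = base_ms       # ~one held-back base frame (assumed)
    print(f"base {base_fps:3d} fps ({base_ms:5.1f} ms/frame) -> "
          f"~{presented_fps} fps presented, ~{extra_latency_ms:.0f} ms added latency")
```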
 
Call me a grumpy old git but I think the idea is that for Hogwarts, once you patch the game with DLSS3.0 and FG, everything runs just fine at 4k with RT on :)

And so it's a good thing, right? Especially as games are released in beta states - the tech is there to help get the game comfortably playable with minimal IQ loss (some implementations better than others).

Would people rather have DLSS & FG, or spend ££££s on a GPU with lots of VRAM? Though I don't think VRAM is wholly the issue with many GPU-hungry games; it's more optimisation. Stupid people keep paying MSRP for the beta experience.

RT is just a nice-to-have, but your graphics card now comes with FSR/DLSS 2.0+ to help you play the new game (you paid MSRP for) until they optimise it further - and you'd prolly have 'completed it mate' by then anyway.
 
Seriously?! :eek: Link/source!
Yeah, seriously, it runs much better on a 3080 than a 7900xtx, megalol. But let's bash Nvidia cause why not.

 
Maybe, but the game still needs DLSS on both the 3090 and the 4070ti, because regardless of VRAM it performs like crap anyway. I mean, the 3090 has 24GB; are you going to play the game at 33 fps?

That's why I always thought the VRAM complaints were childish. VRAM never was and never will be the reason you upgrade a card, because something else will always tank your performance before that (at settings you would actually play the game at).

Not childish, there are games that have VRAM bottlenecks, and if you enjoy using high-res / custom textures then VRAM rises in importance. More and more desktop apps also use VRAM now due to being GPU accelerated.

Not everyone has 3-digit frame rates as their criterion.

I think a good rule is to consider what the consoles have as a baseline.

What's becoming evident in both processing power and VRAM usage is that devs are getting worse at optimising their games: RT reduces their workload on lighting, and better GPUs and DLSS/FSR are another means of not optimising as well.

A recent review on DF showed a stutter fest in a port of a new game, now apparently fixed with a patch (optimisation). Games like the latest Far Cry and the FF7 remake had elevated VRAM requirements due to adopting a unified memory approach, putting things that would usually go in system RAM into VRAM instead.

This meta of assuming people only want VRAM for epeen reasons needs to stop, because it just further encourages Nvidia to under-spec their cards.

For reference, I disable GPU acceleration in Discord/Firefox/Telegram purely to save precious VRAM.
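
If you want to see what is actually eating VRAM before switching acceleration off, something like the sketch below works on NVIDIA cards. It assumes the pynvml / nvidia-ml-py bindings are installed; per-process figures are not reported by every driver/platform combination.

```python
# Minimal sketch: read total/used VRAM (and, where the driver reports it,
# per-process usage) via NVML. Requires the nvidia-ml-py / pynvml package.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"VRAM used: {mem.used / 1024**2:.0f} MiB "
          f"of {mem.total / 1024**2:.0f} MiB")

    # Per-process usage; on some platforms the driver does not report
    # usedGpuMemory, in which case it comes back as None.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = proc.usedGpuMemory
        used_str = f"{used / 1024**2:.0f} MiB" if used else "n/a"
        print(f"  pid {proc.pid}: {used_str}")
finally:
    pynvml.nvmlShutdown()
```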
 
Not childish, there are games that have VRAM bottlenecks, and if you enjoy using high-res / custom textures then VRAM rises in importance...
But that doesn't seem to be the case, Hogwarts runs better on a 3080 10GB than a 7900xtx. The minimums are freaking double on the 3080.
 
For reference, I disable GPU acceleration in Discord/Firefox/Telegram purely to save precious VRAM.
Yep, I had to turn off GPU acceleration in browsers and Steam when I had my 3080, cause Firefox would hog 2GB of VRAM and never let it go without restarting the browser.
 
In this case it looks like the TI would probably have the grunt to run it (4K with RT) but can't due to lack of video memory. The 3090 can run it better, but I agree 33 FPS is not ideal, I guess you'd have to enable DLSS. Hopefully he tests all these scenarios. If the 3090 is getting 33 FPS, what is the 3080 10GB getting? Would it be 10% or so lower, or much lower due to lack of memory? That's what we'd all like to know. :cry:
[benchmark chart]
Curious if this was mentioned in the 4070ti review they did?
 
But that doesn't seem to be the case, Hogwarts runs better on a 3080 10GB than a 7900xtx. The minimums are freaking double on the 3080.
I said both processing and VRAM; the DF review isn't a VRAM example. I don't think it was Hogwarts though.

What's this about? Forspoken apparently is VRAM heavy as well?

Amazing how we went from 6 GB being enough to play Metro Exodus Enhanced Edition with RT on full blast, to 12 GB being the bare minimum to play Forspoken without textures that look like they came out of an N64 game.

Ultimately my point wasn't to say VRAM is king for everyone; clearly everyone has different priorities for what they want from their games. Rather, it was to stop the disrespectful 'VRAM is childish/epeen' type comments. I, for example, have said RT isn't important to me, but I have never called RT users childish; I respect that people have it as a priority. I think the 4000 series is good hardware, the main issue for me is the pricing. The 4070ti is a bit tight on VRAM for my liking though, so personally if I invested it would be in the 4080 or 4090. But if 12 gigs is enough for you, then I think the 4070ti isn't a bad buy, especially compared to current 3080 prices.
 
What's this about? Forspoken apparently is VRAM heavy as well?
Regarding Forspoken, someone else (just before the thread got locked) mentioned in the 12GB VRAM thread that the textures become terrible if your graphics card doesn't have enough video memory.
 
I think one way to solve the VRAM debate, as clearly the community is split, is to sell GPUs with these low VRAM capacities but also make VRAM add-on cards available that would go in another PCIe slot, though I'm not sure how it would impact memory latency/bandwidth having to traverse the PCIe bus. We have modular system RAM, but after all these years VRAM still isn't modular.
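
For a sense of why add-in VRAM over PCIe probably would not fly, here is a rough bandwidth comparison; the figures are ballpark published numbers assumed for illustration, not measurements.

```python
# Ballpark comparison of on-card memory bandwidth vs. the PCIe link an
# add-on VRAM card would have to go through. Figures are approximate
# published numbers (assumed), not measurements.

GDDR6X_3080_GBPS = 760   # ~RTX 3080 memory bandwidth (320-bit, 19 Gbps)
PCIE4_X16_GBPS   = 32    # ~PCIe 4.0 x16, one direction
PCIE5_X16_GBPS   = 64    # ~PCIe 5.0 x16, one direction

for name, link in (("PCIe 4.0 x16", PCIE4_X16_GBPS),
                   ("PCIe 5.0 x16", PCIE5_X16_GBPS)):
    print(f"{name}: ~{link} GB/s, roughly "
          f"{GDDR6X_3080_GBPS / link:.0f}x slower than the 3080's on-card VRAM")
```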
 
In this case it looks like the Ti would probably have the grunt to run it (4K with RT) but can't due to lack of video memory...

Well said mate. It's as I said all along: you'll be running out of performance before VRAM.
 
It supports DLSS3, but frame generation probably can't overcome video memory limitations.

But I thought you said DLSS 3 can't overcome low fps limitations, like you observed in Cyberpunk? Therefore a moot point, no? :D
 
I think one way to solve the VRAM debate... is to make VRAM add-on cards available that would go in another PCIe slot...


Or just stop releasing games on established engines in beta states. VRAM issues are nothing new - they used to come in the form of memory leaks, and back then it was the game's fault; now it's apparently the GPU. No wonder Nvidia and AMD are charging such high prices - with lemons paying MSRP for the state of games on release, they must figure people have no issue paying £1k+ for a gfx card. If you like paying MSRP for a beta, then stop squeaking about the cost of a gfx card. People expect more and more. Eggs/apples have doubled in price, but you still only get 6 eggs/apples. GPU expectation is somewhere around "I want moaaaar for my money". If you like paying for rubbish games then expect rubbish performance. Stop buying games on release if they perform poorly - via Steam you can refund them in under 2 hours.

Again, it's a few per cent of actual games that have issues, and anyone can load up an old game with mods for a VRAM bash. Still people think about hardware mitigation WHEN 99% OF THE TIME IT'S THE FREAKIN GAME, RELEASED IN A SHEET STATE. 99% of ALL games run fine @ 4k.

People keep using games at release to highlight a hardware issue. No games are released in an optimised state - people are happy to pay MSRP for games though, yet jump up and down at having to spend £1k+ on a card to play them because of VRAM... thinking they are getting a game that needs it. Well, I don't see a game with Crysis levels of graphical wowness over the rest among the 0.1% of games that apparently require stacks of VRAM - if I did then I could understand it. Dogwarts is another example of a poorly optimised game on an established engine - Unreal 4. 50fps on a 4090.

Some of the folk clapping and whooping for high VRAM usage because "the game needs it" are the same turkeys who feel entitled to cheap high-end GPUs but won't buy one.

People call for "don't buy GPUs, we are getting ripped off", but are happy to pay MSRP to AAA shareholders for absolute dribble. Just another move towards game access becoming annual/monthly payments.

NEVER buy a game on release from the AAAs for MSRP - send the message. Wait until it's half price, debugged, and optimised. Not many people can have played so many games that the only ones left to play are new releases.
 