NVIDIA 4000 Series

Yes, RT is now,
Exactly. I mean imagine paying a four figure sum for a top of the range AMD GPU and for it to run like crap even with all that vram.

[image: 0tB7S8m.png]

Yeah, gotta love that buttery smooth sub-60fps the top end Nvidia card offers too. Worth every penny!

On a serious note, it doesn't bode well for the long term when games like this and Dead Space are already taxing high-end hardware.
 
Yes, RT is now,


Yeah, gotta love that buttery smooth sub-60fps the top end Nvidia card offers too. Worth every penny!

On a serious note, it doesn't bode well for the long term when games like this and Dead Space are already taxing high-end hardware.

Indeed. So much for all that vram. Goes to show, once games start to get more demanding, vram does not save you. You need to upgrade. 3090 is a perfect example of that.

I am hoping Starfield, which is the game I am most looking forward to, comes out a bit more optimised. Not even sure it has RT anyway, which is what seems to cause the performance issues in most games these days.
 
Yes, RT is now,


Yeah, gotta love that buttery smooth sub-60fps the top end Nvidia card offers too. Worth every penny!

On a serious note, it doesn't bode well for the long term when games like this and Dead Space are already taxing high-end hardware.

Games are gonna start being made with upscaling in mind. DLSS is so good, I'd prefer fidelity over resolution. Consoles have been making that trade forever.

If you want 4K native, just turn down the settings and leave max settings to those willing to use DLSS.
 
Indeed. So much for all that vram. Goes to show, once games start to get more demanding, vram does not save you. You need to upgrade. 3090 is a perfect example of that.

I am hoping Starfield, which is the game I am most looking forward to, comes out a bit more optimised. Not even sure it has RT anyway, which is what seems to cause the performance issues in most games these days.

I think it will as, IIRC, some of the job specs specific to Starfield development were asking for developers with ray tracing experience.

Didn't I say this quite a few times over the past 2 years that the majority of titles would have ray tracing going forward :p ;) :D As great as it is to see developers embracing ray tracing now, it is a shame that these last couple of titles with RT have been very poor implementations, either from an IQ and/or performance POV. At the same time, it is to be expected given it is new tech and requires a very different approach to what game developers have become accustomed to over the years; with time and more experience/familiarity, the implementations will get better. They also need to start dropping raster if they really want to get the best from ray tracing, a la Metro EE. Hopefully Avatar will be a good showcase.....
 
sigh, so now hogwarts legacy is setting the stage for even 12GB vram not being enough and the 4070ti is dead on arrival.

What gpu am I supposed to buy now then? Have a gtx 1660 and play at 1440p.

Edit: is it my imagination or are hardware review channels trying really hard to push for 4k gaming
 
Yup looks like it, gameGPU shows it off nicely

You can see below it's vram heavy and then it becomes ram heavy where the GPU doesn't have enough vram. Based on YouTube comments, if you don't have enough vram or ram the game crashes to desktop after playing for a while

At least it runs well on both Intel and AMD CPUs, no real advantage to either side.

[gameGPU benchmark charts]
Ok, but that is a stupid amount of ram required. I mean 27 gb of system ram? Lmao
 
sigh, so now hogwarts legacy is setting the stage for even 12GB vram not being enough and the 4070ti is dead on arrival.

What gpu am I supposed to buy now then? Have a gtx 1660 and play at 1440p.

Edit: is it my imagination or are hardware review channels trying really hard to push for 4k gaming
The 4070ti's problem in Hogwarts is not vram. Like... at all. The game with RT at 4K native is heavy; no card, including a 4090, can run it. You are going to need to drop settings regardless of vram. A 3080 10GB is performing better than a 7900XTX with 24GB. Yes, the game uses a lot of vram, but even if the 4070ti had 2 terabytes of it, it would still struggle at 4K native + RT. You need to use DLSS or drop resolution and settings.

TLDR : 12gb still fine.
 
Now I'm seeing "open box" Asus 4090s for $1590.

Unfortunately with all the reports of coil whine, I automatically assume these open box units were returned because of coil whine, but still, things are moving in the right direction.
 
You are talking about frametime here.
Total input lag isn’t equal to frametime.
Frametime is the time between frames and input lag is the time between mouse/keyboard/controller input and what's displayed on the screen.
Yes but they are related. If there is 60ms of input lag (according to both nvidia's own benchmarks and the user I was responding to), then at 120hz / 120fps that means it takes more than 7 full frames on the screen before it updates and reflects the movement/control input, and similarly when you stop moving the control it will take 7 full frames before the game responds and stops moving.
Having already tested setups with 30-50ms on input lag I know what this feels like and it's not something I would consider "playable".
Yes the image will look smoother, but 60ms of input lag feels the same as playing at low refresh to me.
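
To put rough numbers on that (a quick back-of-the-envelope sketch using the 60ms figure quoted above, not a measurement):

```python
# How many displayed frames fit inside a given amount of input lag?
# Assumes a fixed 120 Hz refresh and the ~60 ms end-to-end latency figure above.
refresh_hz = 120
frame_time_ms = 1000 / refresh_hz    # ~8.33 ms per displayed frame
input_lag_ms = 60

frames_of_lag = input_lag_ms / frame_time_ms
print(f"{frame_time_ms:.2f} ms per frame -> {frames_of_lag:.1f} frames of lag")
# ~7.2 frames: the screen refreshes several times before your input shows up.
```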
 
Yup like I said a few times now, the ones we all should be blaming are ultimately game devs. A 4090 should not be the minimum required to get good performance in newer titles.... the 4090 is a fantastic gpu, the only gpu where you could say it is worth the price, but that still doesn't change the fact that games such as this shouldn't run like **** on anything but a 4090/4080.
1080p @ 30fps would be fine for me as long as the visuals keep up with the hardware demand and scale down ok on weaker hardware/lower settings.

These vRAM issues are similar to the tessellation of flat surfaces and invisible water in Crysis 2.
 
These vRAM issues are similar to the tessellation of flat surfaces and invisible water in Crysis 2.



[gif: tzMhxNl.gif]

Not similar at all, the tessellation mess was Nvidia and their Gimpworks software trying to cripple the competition's cards (they also crippled their own 900 series doing that, just to push people on those cards to upgrade too).. You know, typical Nvidia games... now we have AMD playing silly games as well; they are both as bad as each other really.

VRAM has stagnated for too long under both AMD and Nvidia, and only recently, with the 6000 series, did AMD add more VRAM than Nvidia as a way to sell their cards and make it look like they had higher specs - which they did for VRAM, and it made the 3000 series look silly on any card below a 3090/Ti: the 6800 XT had 16GB versus 10GB/12GB on the 3080 and 12GB on the 3080 Ti. That let them play the game Nvidia plays and cripple cards with less than 12GB in their AMD-sponsored games, as we saw. The reality is 2020 should have been the year all 3080/6800-class cards had 16GB VRAM as standard, but we saw what happened, and Nvidia was not willing to do that after building the 3080 on the larger die meant for 3090s and Titans. Really, the 3080 Ti should have been the one on the larger die, not the 3080 10/12GB. As we have seen with the 4080 this gen, the 4080 Ti will be on the same chip the 4090 is on now, cut down a little more than the 4090, with 16GB VRAM.


The real 3080 chip ended up as the 3080 Ti 16GB for laptops because it was not faster than the 6800 XT chip. The same chip is now used for the 4080 16GB desktop and 4090 16GB laptop GPUs.
 
Butter those cheeks guys, it's not just VRAM. The game also loves lots of system RAM.

[screenshot: in-game RAM/VRAM usage on an RTX 3080 system]

Just a quick edit on this: I noticed the guy has an RTX 3080. As we all know, 10GB of vram is not enough, and it's especially not enough in this game - the RAM usage might be high because the game is pushing data into system RAM that would otherwise be stored in vram if the 3080 had more of it. Nevertheless, if you have enough system RAM the game seems to run much smoother than if you have too little.

So be warned: if you have a low-vram GPU the game will use your system RAM instead, and if you don't have enough system RAM then the game is simply unplayable.
Wow that gives FF15 a run for its money, FF15 was able to saturate all video ram and over 40 gigs of system ram when nvidia hair was enabled on it lol.
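
A toy way to picture the spillover behaviour described in the quote above (purely illustrative; the sizes are made up and real engines stream assets far more dynamically than a single budget check):

```python
# Toy model of VRAM -> system RAM spillover. Thresholds and sizes are invented
# for illustration; this is not how the engine actually manages memory.
def where_does_it_fit(working_set_gb: float, vram_gb: float, free_ram_gb: float) -> str:
    if working_set_gb <= vram_gb:
        return "fits in VRAM"
    spill_gb = working_set_gb - vram_gb
    if spill_gb <= free_ram_gb:
        return f"VRAM full, ~{spill_gb:.0f}GB spills into system RAM (expect stutter)"
    return "exceeds VRAM + free RAM -> paging, then the crash-to-desktop people report"

# Hypothetical example: a ~14GB working set on a 10GB card
print(where_does_it_fit(14, vram_gb=10, free_ram_gb=20))
print(where_does_it_fit(14, vram_gb=10, free_ram_gb=2))
```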
 
Just one game..... for now..... ;) :p :D

Personally I think when we start seeing more "next gen"/UE 5/5.1 games, even a 4090 will start to look rather mid-end tbh, largely why I am trying to hold off for another gen.
This practice has gone on for decades, and the game doesn't even look that amazing.

Remember ages ago when a dev enabled tessellation below the ground in a game, in an area you can't see, which tanked AMD performance?

The problem with DLSS and FSR is that sponsored/AAA devs now only care if they hit a reasonable framerate.

I noticed though on the AMD bench vs the 3080 that, even though the 3080 had a higher average framerate, it had more frametime spikes aka stutters, and one big stutter.
 
I think it will as, IIRC, some of the job specs specific to Starfield development were asking for developers with ray tracing experience.

Didn't I say this quite a few times over the past 2 years that the majority of titles would have ray tracing going forward :p ;) :D As great as it is to see developers embracing ray tracing now, it is a shame that these last couple of titles with RT have been very poor implementations, either from an IQ and/or performance POV. At the same time, it is to be expected given it is new tech and requires a very different approach to what game developers have become accustomed to over the years; with time and more experience/familiarity, the implementations will get better. They also need to start dropping raster if they really want to get the best from ray tracing, a la Metro EE. Hopefully Avatar will be a good showcase.....
The majority still don't; AAA titles are just a small fraction of the market.
 
Butter those cheeks guys, it's not just VRAM. The game also loves lots of system RAM.

[screenshot: in-game RAM/VRAM usage on an RTX 3080 system]

Just a quick edit on this: I noticed the guy has an RTX 3080. As we all know, 10GB of vram is not enough, and it's especially not enough in this game - the RAM usage might be high because the game is pushing data into system RAM that would otherwise be stored in vram if the 3080 had more of it. Nevertheless, if you have enough system RAM the game seems to run much smoother than if you have too little.

So be warned: if you have a low-vram GPU the game will use your system RAM instead, and if you don't have enough system RAM then the game is simply unplayable.

It's a game that loves all the memory you can give it; after a short while it's using 12GB to 14GB of VRAM on my 4090 and about 12GB of system RAM. With Chrome open on a few tabs, a few basic programs running and whatever Windows 11 is caching in the background, my system RAM usage with this game running is at 29GB and total VRAM usage is at 17GB.

This is why I went 64GB of RAM about 3yrs ago (plus it was cheap at the time, so why the hell not); at least it's something I never have to worry about, and it's certainly paying off now. 32GB is really the amount anyone with a high-end gaming system in 2023 should be using anyway if you like modern games. Sounds like a lot for one game, and it is, but you must account for the crap running in the background, including Windows.
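
Tallying the figures from the post above (rounded, and compared against a hypothetical 32GB system purely for illustration):

```python
# Rough tally of the usage figures quoted above; numbers are rounded and the
# 32GB comparison system is hypothetical.
game_vram_gb = 14                 # "12GB to 14GB VRAM" after a while
game_ram_gb = 12                  # "about 12GB system RAM" for the game itself
total_ram_in_use_gb = 29          # reported total with browser/background apps
background_ram_gb = total_ram_in_use_gb - game_ram_gb

ram_capacity_gb = 32
headroom_gb = ram_capacity_gb - total_ram_in_use_gb
print(f"Game: {game_ram_gb}GB, background: {background_ram_gb}GB, "
      f"headroom on a {ram_capacity_gb}GB system: {headroom_gb}GB")
# One demanding game plus normal background use already sits close to 32GB,
# which is why 64GB leaves comfortable headroom.
```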
 
sigh, so now hogwarts legacy is setting the stage for even 12GB vram not being enough and the 4070ti is dead on arrival.

What gpu am I supposed to buy now then? Have a gtx 1660 and play at 1440p.

Edit: is it my imagination or are hardware review channels trying really hard to push for 4k gaming

Isn't it obvious..... buy a more expensive amd gpu for more vram so you get a better experience:

[image: soj1W0f.png]

Oh wait....nvm.

This practice has gone on for decades, and the game doesn't even look that amazing.

Remember ages ago when a dev enabled tessellation below the ground in a game, in an area you can't see, which tanked AMD performance?

The problem with DLSS and FSR is that sponsored/AAA devs now only care if they hit a reasonable framerate.

I noticed though on the AMD bench vs the 3080 that, even though the 3080 had a higher average framerate, it had more frametime spikes aka stutters, and one big stutter.

The difference is that Crysis and Witcher 3 were 100% intentional gimping. Hogwarts is clearly just lazy optimisation for PC, nothing more.

And whilst that may be the case, the 3080 footage looks far smoother, no surprise given its fps is higher overall whereas the 6800xt's fps is dropping to 5 fps most of the time :o

The majority still don't; AAA titles are just a small fraction of the market.

True in some regards, but AAA titles are far more successful than indie titles, they are also what most of us play on here, and they are what gets all the attention. That still doesn't change the fact that I think every major title released so far this year has had RT in some form, and arguably a good chunk of smaller titles have it too. Like I said many times before, ray tracing isn't going away, it is only gaining more momentum as time goes on; at least RT is still an on/off setting.

I've said it before too, I would like to see more sites benchmarking indie titles, as in my experience, and based on friends who play the same titles, we have always found nvidia to be better than amd here in terms of a smoother and more bug-free experience, probably because AMD's driver team is so small that they have to put more time/effort into titles which get the spotlight, e.g. CoD.
 
With the whole DLSS input lag thing: I take it that with v3 input lag has increased. How does DLSS 2/3 support work - is it just on a game-by-game basis, or can you choose which version of DLSS you want?

I'd rather not have DLSS 3 if the input lag gets too high; I'll be testing it but most likely not use it. Those that remember old school CRT gaming vs those that never have might also be a factor.
 
That's what I meant, a combination of lazy optimisation and deliberate gimping; it's happened for years.

On the larger titles we don't know how many have had Nvidia or AMD intervention, as they are both fighting for customers - e.g. I think the devs of Witcher and Cyberpunk are Nvidia-influenced, and FF15 was Nvidia-influenced as well. The pattern I notice is that when a vendor's features are implemented, by coincidence the game tends to be heavy on the hardware, which coincidentally is beneficial for the company selling GPUs. Reviewers have been asleep at the wheel, not commenting on how Cyberpunk has gradually slowed down on PC as patches have been released.

Lazy optimisation also shows up in storage requirements; duplicate copies of textures in a game aren't uncommon now, nor is making people download a multi-gig patch for a few minor changes.

My prediction for DLSS3 is that it will arrive for Ampere towards the end of the 4000 series' life.
 
Yes but they are related. If there is 60ms of input lag (according to both nvidia's own benchmarks and the user I was responding to), then at 120hz / 120fps that means it takes more than 7 full frames on the screen before it updates and reflects the movement/control input, and similarly when you stop moving the control it will take 7 full frames before the game responds and stops moving.
Having already tested setups with 30-50ms on input lag I know what this feels like and it's not something I would consider "playable".
Yes the image will look smoother, but 60ms of input lag feels the same as playing at low refresh to me.
But that is what I'm trying to explain to you. YES, the game will feel laggy in terms of input, but not because of the FG; it's because your actual framerate is 40 fps. So even with FG off, the input lag would still be horrible. FG doesn't create the input lag problem, it just doesn't fix it. Arguably, it even makes it a bit worse, but that's still because your original input lag without FG is horrible.
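
A simplified sketch of that point (the 40 fps base rate is from the post above; the extra buffering figure is an assumption for illustration, not a measured or NVIDIA-published number):

```python
# Simplified latency model: frame generation raises the *displayed* framerate,
# but responsiveness stays tied to the base render rate. The extra_buffer_ms
# term is a stand-in for FG's interpolation buffering and is an assumption;
# real pipelines (render queue, Reflex, display lag) add their own terms.
def render_latency_ms(base_fps: float, extra_buffer_ms: float = 0.0) -> float:
    return 1000 / base_fps + extra_buffer_ms

base_fps = 40                              # the "actual framerate" from the post
print(f"FG off: ~{render_latency_ms(base_fps):.0f} ms, displayed at {base_fps} fps")
print(f"FG on:  ~{render_latency_ms(base_fps, extra_buffer_ms=10):.0f} ms, "
      f"displayed at {base_fps * 2} fps")
# Motion looks ~80 fps smooth, but the game still responds like a 40 fps game
# (or slightly worse), which is the point being made above.
```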
 