NVIDIA RTX 50 SERIES - Technical/General Discussion

Usage isn't the same as required. People showed the same thing with a 3090 using 20GB+ in some games, yet I was still able to play those same games on a 3080 10GB at around 10-20% less FPS.

Even Monster Hunter Wilds still runs on a 3080, even at 4K. Yes, it's less than half the FPS of a 5090, but it's obviously not requiring 30GB to run.
Aye, I remember the PC version of Final Fantasy XV would basically allocate all of the VRAM it could find.
 
With the missing ROPs on cards, what are the chances that those dies were supposed to be used for different SKUs, as in for the model-down Ti or Super cards?
 
Aye, I remember the PC version of Final Fantasy XV would basically allocate all of the VRAM it could find.

Yep. I remember that. People were playing it fine on 8GB cards, fine on a 1080 Ti with its 11GB, and I had a Titan with 12GB and it used all the VRAM on that too at the time.

Just because a game allocates, or even appears to use, more VRAM does not mean it won't run fine on cards with less VRAM.

That's why I don't pay attention to all the people going on about VRAM. Until I see with my own eyes where it makes a difference, I couldn't care less.

Right now people are upset we are only getting 16GB, but that won't be an issue until next-gen consoles come out, IMO.
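To make the allocation-vs-requirement point concrete, here's a minimal sketch (my own illustration, assuming the nvidia-ml-py / pynvml Python bindings, which are not mentioned anywhere above) that prints the same kind of numbers monitoring tools report. What it shows is allocation as seen by the driver, which is exactly why a game can "use" 20GB on a 3090 yet still run fine on a 10GB 3080:

```python
# Rough sketch, assuming the nvidia-ml-py bindings (pip install nvidia-ml-py).
# Prints overall VRAM in use plus per-process figures; these are allocations
# as reported by the driver, not a hard requirement for the game to run.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Total: {mem.total / 2**30:.1f} GiB, in use: {mem.used / 2**30:.1f} GiB")

# Per-process breakdown (games show up under graphics processes).
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    used_str = f"{used / 2**30:.1f} GiB" if used is not None else "n/a"
    print(f"PID {proc.pid}: {used_str}")

pynvml.nvmlShutdown()
```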
 
Hope and optimism are all I have left :p. My 1070 Ti desperately needs an upgrade. I mostly play esports/indie/AA games, so it's been fine, but now is the time, as there are enough AAA games built up that I want to play, plus games that need RT to even run (Indiana Jones, Doom: The Dark Ages and, I suspect, Metal Gear Delta), but I don't want to feel like I've been bent over.

I'm open to AMD as well, but the RDNA4 thread doesn't seem that optimistic either :(.
I remember you from our days playing Bad Company 2 together... anyway, is the Intel offering not viable, or are you looking for something with a bit more than what Intel can offer?
 
With the missing ROPs on cards, what are the chances that those dies were supposed to be used for different SKUs, as in for the model-down Ti or Super cards?
Saw someone hypothesising that it's caused by which specific cores are disabled when cutting the die down from a larger chip. So they are cut down by the same percentage, it's just that in some cases more ROPs get blocked off than in others.
 
Yep. I remember that. People were playing it fine on 8GB cards, fine on a 1080 Ti with its 11GB, and I had a Titan with 12GB and it used all the VRAM on that too at the time.

Just because a game allocates, or even appears to use, more VRAM does not mean it won't run fine on cards with less VRAM.

That's why I don't pay attention to all the people going on about VRAM. Until I see with my own eyes where it makes a difference, I couldn't care less.

Right now people are upset we are only getting 16GB, but that won't be an issue until next-gen consoles come out, IMO.
Yeah, VRAM conversations are super annoying to listen to.
 
Saw someone hypothesising that it's caused by which specific cores are disabled when cutting the die down from a larger chip. So they are cut down by the same percentage, it's just that in some cases more ROPs get blocked off than in others.
That doesn't explain the 5080's missing ROPs; it uses the full die.
 
That doesn't explain the 5080's missing ROPs; it uses the full die.
Which is why I suspect the missing ROPs are for a 5070 Super or something.
Time will tell. I'm sure they will be stockpiling these missing-ROP units and will release something with that actual spec later down the line.
 
Usage isn't the same as required. People showed the same thing with a 3090 using 20GB+ in some games, yet I was still able to play those same games on a 3080 10GB at around 10-20% less FPS.

Even Monster Hunter Wilds still runs on a 3080, even at 4K. Yes, it's less than half the FPS of a 5090, but it's obviously not requiring 30GB to run.
Yeah, it could be playable, but wasn't it shown that textures were not being loaded in properly, or not at all, as a result?
 
Yeah, the 5090 FE cooler is really small. I would worry about temps in my closed case, especially in summer when my room gets really warm.

Have you tried setting a custom fan profile using a tool like MSI Afterburner? It may help to avoid those fan spikes, as you can keep the fans running at a minimum level instead of letting them stop and restart.

You'd need far, far more metal to keep a heat-generating component under 50°C.

It's not so much the amount of metal, because eventually it will reach equilibrium; it's probably caused by not having enough passive airflow, like a nearby case fan gently blowing air in its direction. You probably wouldn't need much airflow directed at the GPU to keep it under 50°C when idle or doing light tasks; something like a 120mm fan spinning at 800rpm nearby would probably be enough.
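If anyone wants to see the idle ramp cycle for themselves before or after setting a custom curve, here's a rough logging sketch (my own, assuming the nvidia-ml-py / pynvml bindings; it only reads the sensors and doesn't change any fan settings):

```python
# Rough monitoring sketch, assuming the nvidia-ml-py / pynvml bindings.
# Logs GPU temperature and reported fan speed every few seconds so you can
# see the idle ramp-up/stop cycle and judge whether a custom fan curve or a
# nearby case fan actually smooths it out.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        fan = pynvml.nvmlDeviceGetFanSpeed(handle)  # percent of max
        print(f"{time.strftime('%H:%M:%S')}  {temp}°C  fan {fan}%")
        time.sleep(5)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Eyeballing the log should show the sawtooth of the fans kicking in around the 50°C mark and switching off again.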
I game downstairs on my 77” OLED, so I never hear it under load. I've watched it draw around 525 watts and it stays under 75°C, most of the time around 66°C, which is not bad. But when idling, I'm using my monitor upstairs close to the PC, and that's when the ramping happens: monitoring shows that while web browsing there are spikes to full core power now and again, which makes the heat build up.

My case has 6x 120mm intake and 2x 180mm plus 1x 120mm exhaust, and I had zero issues with the 4090 FE. There's a thread open on the Nvidia forum with users complaining about it and asking for a speed lower than 30% to keep the fans at to stop the ramping.
 
Yeah, it could be playable, but wasn't it shown that textures were not being loaded in properly, or not at all, as a result?
I've not experienced that with any games on my 3080; what game are you referring to in particular?
Usually VRAM issues result in severe frame rate drops, down to single-digit FPS, along with texture pop-in, so not playable either.
 
I've not experienced that with any games on my 3080; what game are you referring to in particular?
Usually VRAM issues result in severe frame rate drops, down to single-digit FPS, along with texture pop-in, so not playable either.
Steel Rising had texture streaming issues on my 3080 but not on the 5080 I had briefly. That game is one of the worst optimised games I've ever played, though. Lots of people on this board seem to enjoy paying over the odds for GPUs to brute-force their way through horribly optimised games. A bit like when people were trying to justify the cost of the 3090 purely to use the Far Cry 6 texture pack.
 
I remember you from our days playing Bad Company 2 together... anyway, is the Intel offering not viable, or are you looking for something with a bit more than what Intel can offer?

Hello! Bad Company 2, good times, that seems like a lifetime ago now :p

Intel is out for a few reasons.

1) CPU overhead. I have an 11400 with power limits off and Gear 1 3600 memory; in that config the TechPowerUp review puts it about 10% behind a 5600X, so not terrible, but not great, and so CPU overhead is a concern. Even at 1440p, where it's not so bad, it's still a concern for me, as those tests are always at max settings. I'm not always on max: playing something like Overwatch, I'm on low, as that gets rid of visual clutter (there are bushes, trees and other things that are just not rendered on low model detail), which tips the load back onto the CPU. I'm also not on 1440p yet (monitor and CPU are next on the chopping block once I upgrade; I can't afford to do it all at once, so GPU first and I'll enjoy whatever uplift I can get while I save for the other upgrades).
2) I want a bit more grunt. I keep my stuff for a long time, and I just feel I want a bit more power than something between a 4060 and a 4060 Ti.
3) Outside of multiplayer games, I mostly play AA/indie games. While Intel have been fixing problems, I'm not confident they will move at a decent pace to fix a random indie game that 1,000 people are playing versus whatever the next AAA title is. AMD or Nvidia are a safer bet there.

The RDNA4 thread seems a little more positive now. Assuming independent testing holds up to AMD's claims, a 9070 seems like it could be for me once the price drops. No way is it staying at only a $50 difference between it and the XT; no one will buy it. I'm patient and can wait, or maybe the 5060 Ti will surprise me and Nvidia will be aggressive with the pricing :p.
 
Yeah, it could be playable, but wasn't it shown that textures were not being loaded in properly, or not at all, as a result?
Is there a limit to the number of VRAM examples required from forum users/tech sites/articles about Ampere and Ada VRAM issues? Devs have even released statements explaining the fallbacks introduced for GPUs without enough VRAM.

The irony is that the never-ending excuses/acceptance means Nvidia keeps increasing the price of VRAM, to the point that the current 16GB buy-in is about £900 give or take, or it's £2,000+ for more than 16GB. But it's plenty, right?
 
I've been finding my 5090 FE pretty good, but one annoying thing about it is that when idle and you are web browsing etc., basically not gaming, if the room is warm because the sun has been hitting it for a few days, the fans keep ramping up to cool the card down and then switch off, and around 10 minutes or so later it repeats all over again.

The cooler is good as long as it's got the fans spinning, but as soon as they stop there is not enough metal to dump the heat into before it goes over 50°C and the fans ramp up again.

So far the 5090 is good, but the drivers are trash. I am getting micro stutters on my 77” Samsung 95D that I play on downstairs; my 4090 had none at all. I've opened a thread on the Nvidia forum, as there are many with the same problem, so I'm trying to get this fixed.

I would say overall the 4090 is more solid; the 5090 just feels a bit rough at the moment. I know the 4090 was a little like that too, just not as bad as this from what I remember.
Thanks for sharing. This is why hearing real-world experiences is important, as this sort of stuff rarely gets mentioned in reviews. It is such a shame they were hell-bent on making the card two-slot. Even just making it three-slot (4090 size) would make a massive difference.
I keep toying with the idea of a sim rig, but I've got so many frigging expensive hobbies, I really don't need another. I'd also need to build a whole second gaming PC, as my main system is in the living room.
Maybe one to avoid then. You'll get a 'basic' wheel and pedal setup, then get the buzz and want to upgrade. Then you may fall out of love with it. That's kind of what has happened to me, haha. I do like it, but you have to put so much time in to remain competitive online (ACC, iRacing, etc).
 
The Nvidia hotfix driver meant to fix crashing and black screens on RTX 5000 GPUs has actually made the situation worse, and now users report all games crash when Multi Frame Generation is enabled.


Ironically, I've had this black screen crashing issue on my 4090 for the last year. It's infrequent for me, though, and only sometimes occurs when switching which screen is my main display, but it still highlights that Nvidia has chronic driver stability issues.
 
Hello! Bad Company 2, good times, that seems like a lifetime ago now :p

Intel is out for a few reasons.

1) CPU overhead. I have an 11400 with power limits off and Gear 1 3600 memory; in that config the TechPowerUp review puts it about 10% behind a 5600X, so not terrible, but not great, and so CPU overhead is a concern. Even at 1440p, where it's not so bad, it's still a concern for me, as those tests are always at max settings. I'm not always on max: playing something like Overwatch, I'm on low, as that gets rid of visual clutter (there are bushes, trees and other things that are just not rendered on low model detail), which tips the load back onto the CPU. I'm also not on 1440p yet (monitor and CPU are next on the chopping block once I upgrade; I can't afford to do it all at once, so GPU first and I'll enjoy whatever uplift I can get while I save for the other upgrades).
2) I want a bit more grunt. I keep my stuff for a long time, and I just feel I want a bit more power than something between a 4060 and a 4060 Ti.
3) Outside of multiplayer games, I mostly play AA/indie games. While Intel have been fixing problems, I'm not confident they will move at a decent pace to fix a random indie game that 1,000 people are playing versus whatever the next AAA title is. AMD or Nvidia are a safer bet there.

The RDNA4 thread seems a little more positive now. Assuming independent testing holds up to AMD's claims, a 9070 seems like it could be for me once the price drops. No way is it staying at only a $50 difference between it and the XT; no one will buy it. I'm patient and can wait, or maybe the 5060 Ti will surprise me and Nvidia will be aggressive with the pricing :p.
A 5060 Ti is already shaping up to be a dead duck given the product stack, not to mention super expensive. That's not to say it wouldn't be an upgrade over what you have now, but there are plenty of options that fit your budget. Once the 50 series becomes more available (although that's a while off), the prices of the 40 series should drop, so something like a 4070, 4070 Ti or a 4080 if they come up cheap. Of course, if the 9070 or 9070 XT can meet its price point and isn't heavily scalped by retailers or private individuals, I can't see why that would be bad for you in any way, shape or form. If you have to have Nvidia, then wait for a cheap 4070. You get most of the feature set of the 50 series but without the fires or missing ROPs :P
 