NVIDIA RTX 50 SERIES - Technical/General Discussion

Doubt Jensen cares, considering gaming was 7% of their total revenue.

Also, those are very early figures. Of course they were going to outsell the 5070/Ti when there were thousands more units in stock. Give it a few months to see if AMD have really clawed back some market share.

Super or Ti refreshes around a year down the line are almost inevitable nowadays, so that's not really saying anything.
 
The tide is changing, with the 9070 massively outselling the 5070. Nvidia got complacent, and Jensen will hate seeing gamers desert green for red. I expect a Super refresh with a 16GB 5070.
Jensen doesn't care; he's basically traded 700M in gaming revenue for 5B of extra data center revenue by switching production. I expect gaming will be even lower in the next earnings call, with data center even higher.
 
Will you say it's a nothing burger in 10 years when RTX and DLSS stop working on current games because future Nvidia architectures don't have fixed-function Tensor cores?
You are just literally making that up and there is absolutely no indication DLSS or RTX will stop working in 10 years.

PhysX 32-bit getting canned is indeed a nothing burger that has generated the usual exaggerated nerd-rage in a tiny minority of people.
 
You are just literally making that up and there is absolutely no indication DLSS or RTX will stop working in 10 years.

PhysX 32-bit getting canned is indeed a nothing burger that has generated the usual exaggerated nerd-rage in a tiny minority of people.
I'm annoyed about losing PhysX, because even though I don't play it much any more, I've got more than 500 hours in Borderlands 2 and may well want to go back to it at some point. But... yeah. PhysX never really got widespread adoption and support petered out after a couple of years. DLSS and RTX, by contrast, are in most modern AAA titles and have been for years.
 
I'm annoyed about losing PhysX, because even though I don't play it much any more, I've got more than 500 hours in Borderlands 2 and may well want to go back to it at some point. But... yeah. PhysX never really got widespread adoption and support petered out after a couple of years. DLSS and RTX, by contrast, are in most modern AAA titles and have been for years.

I remember when the dedicated Ageia boards were available, on PCI I think? They weren't cheap.

They never really took off as a must-have or a requirement, and at the time I had AMD cards anyway (X1900 XT).
 
You are just literally making that up and there is absolutely no indication DLSS or RTX will stop working in 10 years.

PhysX 32-bit getting canned is indeed a nothing burger that has generated the usual exaggerated nerd-rage in a tiny minority of people.

Nvidia devotee right here. Any loss of features that work perfectly on the previous gen is a slap in the face for consumers.
 
On one hand I don't mind Nvidia dropping 32-bit PhysX from their GPUs. 32-bit is, and should be, a thing of the past from a hardware perspective, not least because of the security issues and performance limits it has.

What DOES bother me, though, is that Nvidia didn't add some software emulation or conversion layer to keep those older titles supported (or open it up for wider use), even if it meant patching the PhysX DLLs for each affected game. Someone may still do this.
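For what it's worth, here's the rough shape of what I mean by a shim, in the abstract. The names are entirely made up (this is not the real PhysX API); the point is only that a drop-in 32-bit DLL could export whatever an old game imports and route the work to a CPU fallback instead of the removed GPU path:

```cpp
// Bare-bones sketch of the "drop-in shim" idea. These names are hypothetical,
// NOT the real PhysX API.

// Hypothetical CPU implementation of one physics step.
static void cpu_fallback_step(float dt)
{
    (void)dt; // ... run the simulation on the CPU here ...
}

// Exported with the name and calling convention the old game expects, so
// the game's import table is satisfied and the call quietly lands on the
// fallback instead of failing.
extern "C" __declspec(dllexport) void __stdcall LegacySimulateStep(float dt)
{
    cpu_fallback_step(dt);
}
```

In practice a real shim would have to cover the whole 32-bit PhysX surface and match its ABI exactly, which is why it's non-trivial, but hardly impossible for a company of Nvidia's size (or, eventually, a determined modder).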
 
You are just literally making that up and there is absolutely no indication DLSS or RTX will stop working in 10 years.

And there was an indication that PhysX would stop working in 10 years' time...? Oh wait

Deprecation is the act of notifying your userbase about the future removal of a feature, not just removing it without warning
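To show what I mean by notifying first: even something as simple as a compile-time deprecation marker counts as a warning. Hypothetical function name, C++ purely for illustration:

```cpp
#include <cstdio>

// Hypothetical example: the feature keeps working, but every caller gets a
// compile-time warning that it is scheduled for removal. That's deprecation.
[[deprecated("32-bit simulation support will be removed in a future release")]]
void simulate_32bit()
{
    std::puts("still works, but you've been warned it's going away");
}

int main()
{
    simulate_32bit(); // compiles and runs, with a deprecation warning
    return 0;
}
```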
On one hand I don't mind Nvidia dropping 32-bit PhysX from their GPUs. 32-bit is, and should be, a thing of the past from a hardware perspective, not least because of the security issues and performance limits it has.
What are you even talking about
 
And there was an indication that PhysX would stop working in 10 years' time...? Oh wait

Deprecation is the act of notifying your userbase about the future removal of a feature, not just removing it without warning

What are you even talking about
So because 32-bit PhysX was deprecated (and only the 32-bit path), we can now start wildly speculating that the most popular and best form of upscaling on the market will also be canned? Just... lol...
 
And there was an indication that PhysX would stop working in 10 years' time...? Oh wait

Deprecation is the act of notifying your userbase about the future removal of a feature, not just removing it without warning

What are you even talking about
Since you ask:

Security-wise, 32-bit code doesn't support full ASLR or DEP in memory, leading to potential security flaws. Honestly, I'd rather not have my GPU and its drivers be an avenue for exploitation. Nvidia had quite a few such issues with their drivers 10 or so years ago.

From a performance perspective, there are the addressable memory and threading limits of 32-bit, which, granted, aren't likely an issue in old games; my point was more that 32-bit is dead. Moreover (and this is the bit I don't know about, as I don't work on GPU design specifically), there is, or was, likely at least some instruction-set if not hardware overhead to support 32-bit PhysX on modern GPUs. As another poster pointed out, some folks would be happy not to waste precious die space on that.
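Purely as an illustration of the addressable-memory point (nothing PhysX-specific, just toy code): a 32-bit build has 4-byte pointers and therefore a hard ~4 GB theoretical address space, however much RAM the machine has.

```cpp
#include <cstdio>

int main()
{
    // Pointer width decides the theoretical address space:
    // 4-byte pointers -> 2^32 addresses (~4 GB); 8-byte pointers -> far more
    // than any game will ever touch.
    std::printf("pointer size: %zu bytes\n", sizeof(void*));

    const unsigned long long addressable =
        (sizeof(void*) == 4) ? (1ULL << 32) : 0xFFFFFFFFFFFFFFFFULL;
    std::printf("theoretical address space: %llu bytes\n", addressable);
    return 0;
}
```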
 
ASLR and DEP aren't applicable to GPU memory, and your CPU has dedicated silicon for processing 32-bit instructions, so you're already vulnerable via that avenue.

Nvidia could (and should) have provided compatibility for 32-bit PhysX. For a company of their size it's not unreasonable to expect that, and the lack of it is disrespectful and worrying for the future.

It's not like they didn't know they were breaking something here.

Implementing a software layer to convert the 32-bit instructions into 64-bit for compatibility would have been a better option. Them dropping features without warning (and those complacently defending it) is pretty gross, IMO.

Thanks for clarifying your point about memory limits, but as you said, moot point in old games.
 
ASLR and DEP aren't applicable to GPU memory, and your CPU has dedicated silicon for processing 32-bit instructions, so you're already vulnerable via that avenue.

Nvidia could (and should) have provided compatibility for 32-bit PhysX. For a company of their size it's not unreasonable to expect that, and the lack of it is disrespectful and worrying for the future.

It's not like they didn't know they were breaking something here.

Implementing a software layer to convert the 32-bit instructions into 64-bit for compatibility would have been a better option. Them dropping features without warning (and those complacently defending it) is pretty gross, IMO.

Thanks for clarifying your point about memory limits, but as you said, moot point in old games.
PhysX does have a footprint in general RAM; it's not entirely GPU-bound. But yes, I agree with your general points here. As mentioned, I'm not sure what die space, if any, has actually been saved by removing 32-bit support; it would be odd to remove it for literally no reason.

More importantly, I fully agree Nvidia should have done something. Software emulation, or providing a PhysX patch for old games, would be the right thing to do. Heck, even a more detailed statement of why they've done it would be better than the silence we've had.

To be clear I'm not defending Nvidia's move, just trying to find some logic as to why they removed the support for it.
 
Technical Insights into DLSS 4: Multi Frame Generation and Transformer-Based Architectures

Multi Frame Generation is a technique that enables the generation of three additional frames for every traditionally rendered frame. This approach results in a significant improvement in frame rates, effectively achieving a fourfold increase in performance.

Fake marketing lies. Frame gen does not increase performance.
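Rough back-of-the-envelope on why the "4x performance" framing gets pushback: MFG multiplies the frames you see, but new game state (and input sampling) still only happens on the traditionally rendered frames. Toy numbers, purely illustrative:

```cpp
#include <cstdio>

int main()
{
    const double rendered_fps = 30.0; // frames the game actually simulates and renders
    const int    generated    = 3;    // extra frames MFG inserts per rendered frame ("4x")

    const double displayed_fps     = rendered_fps * (1 + generated); // what the fps counter shows
    const double rendered_frame_ms = 1000.0 / rendered_fps;          // gap between real frames

    std::printf("displayed: %.0f fps, but new game state (and input) is still only "
                "sampled every %.1f ms\n",
                displayed_fps, rendered_frame_ms);
    return 0;
}
```

Smoothness goes up; responsiveness doesn't, which is the distinction the marketing glosses over.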
 
Does anybody use the Nvidia app a lot and recommend it? I generally just use the driver and haven't used the app, and wanted to see if it's required or adds anything beyond what you can already change in each game's own settings.
 