Blackwell GPUs

Software is currently being worked on that can generate 4 frames for every 1 real frame. I'll never use it though; current Nvidia and AMD frame gen is not appealing to me. Yes, there is the latency, but the latency is not too bad if you're only generating one fake frame for each real frame. The real issue for me is the smearing/blurring that frame gen creates.
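
For what it's worth, the smoothness/latency split is easy to put numbers on. A minimal sketch below, with the base frame rate and generation ratio assumed purely for illustration:

```python
# Rough illustration of why frame generation boosts smoothness but not
# responsiveness. All numbers here are assumed for the example.

def frame_gen_stats(real_fps: float, generated_per_real: int) -> None:
    """Compare the presented frame rate with the cadence of real frames."""
    presented_fps = real_fps * (1 + generated_per_real)
    # Input only affects the *real* frames, so responsiveness is still
    # governed by the real frame time (this sketch ignores any extra
    # hold-back the interpolation itself adds).
    real_frame_time_ms = 1000 / real_fps
    print(f"{generated_per_real} generated per real frame: "
          f"{presented_fps:.0f} fps on screen, input still tied to "
          f"~{real_frame_time_ms:.1f} ms real frames")

frame_gen_stats(60, 1)  # today's 1:1 frame gen
frame_gen_stats(60, 4)  # the rumoured 4-for-1 mode
```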

Being a gamer in 2024 consists of: TAA blurring, DLAA smearing, frame gen smearing, DLSS/FSR low internal resolution, tons of post-processing junk like film grain, and then the cherry on top is developers using PS3-quality textures while at the same time demanding you have 16 GB of VRAM.

Combine all of these "features" and you get a modern AAAA $100 game.

Wrong!

Combine them all and you have Grim panting to throw money at Jensen to buy a 5090 :p :cry:
 
I know you are not very serious about it, but enough people here are, and it seemed like a good enough time to give my thoughts on such "improvements". :) In other words: not interested. And considering AI is No. 1 for Nvidia these days, I suspect they will show more AI stuff.
The RTX 4090 is significantly faster than anything else and is by far the best-performing gaming GPU ever released. For those with deep pockets who play at 4K, or especially those who play VR, it has been a great buy. The type of people who buy an RTX 4090 are generally at the bleeding edge anyway, and so of course are statistically far more likely to upgrade each generation to the latest and greatest to get more performance. This is nothing new and has been happening every generation since the first performance GPUs were invented (I should know, I was there).

However, what has changed in the last couple of generations since the 2080 Ti is that the price of entry to the bleeding edge has increased disproportionately (RTX 3090 and RTX 4090), so that for many people it's no longer possible to comfortably afford it. Or, the 'good value' GPUs were just straight-up unavailable (RTX 3080). To switch to an RTX 4090, people often needed to upgrade not only their PSU but also their case just to fit it in, which, let's face it, sucks. This spiralling barrier of entry to the high-end is, understandably and justifiably, where many if not most people's bitterness springs from.

I will be getting an RTX 5090 only for VR, as it has basically changed my gaming life and it can never have enough juice. Once you have experienced this, it's hard to go back... :D

 
Therefore Blackwell is 27% more efficient than Ampere. So the 5080 must be a significantly larger GPU than the 4080 - it must have significantly more cores.

You can't make that leap yet though, because the reliability of the information is too weak. In what scenario is a 5080 "4090 + 10%", and what is being measured?

I agree that the 5000 series needs to be significantly larger SKU-for-SKU to provide meaningful gains; we just haven't seen evidence of that, nor does it preclude Nvidia trying to BS the 5000 series performance with marketing fluff or non-comparable features. Personally I think we are going to see the latter this gen, because Nvidia really doesn't like giving larger dies to consumers and we allegedly haven't got a major node change this time around... Of course, I know no more than anyone else; it just seems to be heading that way.
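
For illustration, the arithmetic behind that kind of leap can be sketched as below; every figure is an assumption plugged in for the example, not a leak:

```python
# Back-of-the-envelope: if perf/W barely moves, extra performance must
# come from more power and/or more silicon. All inputs are assumptions.

assumed_4080_perf = 100.0                    # index the 4080 at 100
assumed_4090_perf = 130.0                    # illustrative gap
target_5080_perf = assumed_4090_perf * 1.10  # the rumoured "4090 + 10%"

assumed_effic_gain = 1.10   # modest perf/W gain over the 4080, assumed
assumed_4080_power = 320.0  # watts, illustrative

# Board power a 5080 would need at that efficiency to hit the target:
needed_power = (target_5080_perf / assumed_4080_perf
                * assumed_4080_power / assumed_effic_gain)

print(f"Target perf: {target_5080_perf:.0f} (4080 = 100)")
print(f"Implied board power: ~{needed_power:.0f} W")
# If that wattage is off the table, the remaining lever is a larger GPU
# (more cores at lower clocks) - which is exactly the disputed claim.
```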
 
The type of people who buy an RTX 4090 are generally at the bleeding edge anyway, and so of course are statistically far more likely to upgrade each generation to the latest and greatest to get more performance.
Not seeing any reason to upgrade from the 4090 yet. If I get too little FPS (below 90), I switch from DLAA to DLSS. If it becomes a bigger problem in the future, a small push of some of the quality sliders in games to the left will do, as more often than not the change in visual quality is negligible and suddenly FPS jumps up considerably, as I've noticed in new games.
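
As context for why that DLAA-to-DLSS switch buys so much FPS: DLAA renders at native resolution, while DLSS Quality mode renders internally at roughly two-thirds of it per axis, so the pixel workload drops by more than half:

```python
# Pixel workload at 4K: DLAA (native) vs DLSS Quality mode.
# Quality mode's ~0.667-per-axis scale factor is DLSS's published ratio.

native_w, native_h = 3840, 2160
quality_scale = 2 / 3  # per axis

dlss_w = round(native_w * quality_scale)
dlss_h = round(native_h * quality_scale)
ratio = (dlss_w * dlss_h) / (native_w * native_h)

print(f"DLAA renders {native_w}x{native_h} = "
      f"{native_w * native_h / 1e6:.1f} MP")
print(f"DLSS Quality renders {dlss_w}x{dlss_h} = "
      f"{dlss_w * dlss_h / 1e6:.1f} MP ({ratio:.0%} of the pixels)")
```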

This is nothing new and has been happening every generation since the first performance GPUs were invented (I should know, I was there).
I'm old enough to have used 8-bit computers with passion back in the day. :) And I've almost always aimed at the xx80 class of cards, unless the xx70 was already so good it was much smarter to get that. I got this xx90 card once, and it was worth it for many reasons, but I still see zero reason to go for any 5xxx series, as is.

However, what has changed in the last couple of generations since the 2080 Ti is that the price of entry to the bleeding edge has increased disproportionately (RTX 3090 and RTX 4090), so that for many people it's no longer possible to comfortably afford it.
Or they simply see no reason to blow that much money on gaming - the huge majority of gamers (in pure numbers, almost all gamers) play on mobile. Consoles have much lower numbers, and so do PCs these days. Compared to mobile, we are a tiny and not very significant market. So vendors can charge anything they want, as this isn't where the real money is anyway, and they don't seem to care about selling "big" numbers - reporting high profit margins to shareholders looks really good instead.

Or, the 'good value' GPUs were just straight-up unavailable (RTX 3080).
True, been there. Never before had I waited so many months to not get a product I ordered.

To switch to an RTX 4090, people often needed to upgrade not only their PSU but also their case just to fit it in, which, let's face it, sucks. This spiralling barrier of entry to the high-end is, understandably and justifiably, where many if not most people's bitterness springs from.
In my case (pun not intended) it was just the GPU - I had everything else already. That helped make the decision, true.

I will be getting an RTX 5090 only for VR, as it has basically changed my gaming life and it can never have enough juice. Once you have experienced this, it's hard to go back... :D

I've been fascinated with VR since I first experienced it as a kid (the first prototypes), but these days, oddly, not so much anymore.
 
It'll likely go like the new Ryzen - much faster in niche uses, only a bit faster in general gaming; more transistors and less power per operation, but more transistors also means more power to run the chip at proper speed. And the obligatory high prices.
 
To summarize your post... I would say that if you are not into VR, then anything beyond the RTX 4090 likely isn't needed for 4K or below anyway, and as you say most games push 90-120 Hz (with or without DLSS), so you could likely skip a generation and be happy.

Out of interest, did you try VR in the last few years?
 
If the 600W thing is true, I think it's a really stupid idea. At that level, a 1000W PSU is likely not going to be enough for many system configurations, with 1200W probably becoming the minimum. 1000W PSUs are very popular, having long been in the "fine for anything but the most exotic setups" camp. This would, for many people, mean adding the cost of a new PSU to the cost of the GPU. It also - and this may not be an issue for the most technically confident, but it will in the wider market - means that rather than doing a simple GPU swap when upgrading, you need to do a PSU swap with all of the frankly-painful recabling that entails.
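
A rough power budget shows where the 1000 W worry comes from; the non-GPU figures below are assumptions for a typical high-end build, not measurements:

```python
# Rough PSU budget for a high-end build around a hypothetical 600 W GPU.
# Every component figure below is an assumption, not a measurement.

budget_watts = {
    "GPU (rumoured TGP)": 600,
    "CPU under load": 250,
    "Board, RAM, SSDs, fans": 100,
}
sustained = sum(budget_watts.values())

# GPUs also pull brief transient spikes well above their rated power,
# so a common rule of thumb is to keep the sustained load at or below
# ~80% of the PSU rating.
recommended_psu = sustained / 0.8

print(f"Sustained draw: ~{sustained} W")
print(f"Rule-of-thumb PSU size: ~{recommended_psu:.0f} W")
# ~1190 W, which is why a 1000 W unit starts to look marginal here.
```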
 
I'm loving my 4090 now that I have VR. The only thing I don't like is the controls for some games: Beat Saber is excellent, but Skyrim needs keyboard and mouse.

What? No, you don't need a keyboard and mouse at all for Skyrim VR. Are you just playing the stock version? Get one of the mod packs. They completely transform the game!!
 
Out of interest, did you try VR in the last few years?
I currently have a Quest 2 and have used it with the PC as well. It was fun now and then, but I'm mostly into RTS games and the like, so it's not useful there. If you're into simulation, for example, then it's a much bigger thing, I'm sure. :)
 
If the 600W thing is true, I think it's a really stupid idea.
All of that would be acceptable for a high-end user (many just buy a whole new PC then, so not a big problem). But the generated heat is a deal breaker for me - most of the UK has no air conditioning in flats and houses. That's an oven situation in summer.
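
The heat point is easy to quantify, since essentially every watt the PC draws ends up as heat in the room. A quick sketch with an assumed system draw:

```python
# Nearly all electrical power a PC draws ends up as heat in the room.
# The system draw and session length are assumed for illustration.

system_draw_w = 850  # assumed: 600 W GPU plus CPU and the rest under load
hours_gaming = 3

heat_kwh = system_draw_w * hours_gaming / 1000
print(f"~{system_draw_w} W dumped into the room; "
      f"{heat_kwh:.1f} kWh of heat over a {hours_gaming}-hour session")
# For scale, small space heaters are typically rated at 750-1500 W -
# not fun in an un-air-conditioned UK flat in summer.
```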
 
This would, for many people, mean adding the cost of a new PSU to the cost of the GPU.
I can’t imagine someone upgrading from a 4090 to a 5090 would be overly concerned about the extra cost for a new PSU and fitting it.
 
I don't consider the extra power "needs" from the supposed leaked info that important. Just like the 4090, the 5090 is typically going to run at much lower draws in games anyway, and once an undervolt is applied, that will be around 150 watts less, maybe more. My 4090 typically draws less power than the 3080 Ti FE did.
 
I don't consider the extra power "needs" from the supposed leaked info that important.

Yup, I never care about this either; it was the same for Ampere too. I ended up undervolting mine to 0.825 V @ 1830 MHz and basically get roughly similar perf at 100+ W less. Also, factor in things like upscaling and frame gen, and this will also reduce power consumption.
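
That sort of saving is consistent with first-order CMOS scaling, where dynamic power goes roughly as f x V^2. A minimal sketch, with the stock voltage, clock, and power all assumed for illustration:

```python
# First-order undervolt estimate: dynamic power ~ f * V^2.
# Stock figures below are assumptions, not measured values.

def scaled_power(p_stock: float, v_stock: float, f_stock: float,
                 v_new: float, f_new: float) -> float:
    """Scale stock power by the f*V^2 ratio (ignores static leakage)."""
    return p_stock * (f_new / f_stock) * (v_new / v_stock) ** 2

stock_w = 350.0  # assumed stock board power
est = scaled_power(stock_w, v_stock=1.05, f_stock=1950.0,
                   v_new=0.825, f_new=1830.0)
print(f"Estimated draw after undervolt: ~{est:.0f} W "
      f"(~{stock_w - est:.0f} W saved)")
# Real savings come in lower, since not all board power is core
# dynamic power - but the order of magnitude matches "100+ W less".
```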
 