
NVIDIA RTX 50 SERIES - Technical/General Discussion

Genuinely lost all interest in current GPUs, lol - perhaps even future ones. By the time my 4090 is too old to run games well, the gaming space will have shifted to a big enough degree that a dedicated GPU probably won't matter any more, as AI/ML will be doing all the heavy lifting anyway and we will be "gaming" on any type of machine.

They'll still charge you for all that R&D to enjoy it.

I've gone in the opposite direction: with display technology being so good, it goes hand in hand with my need to get more juice out of my GPU. Diminishing returns and a half.

I'm gonna buy an Aorus Master/Waterforce and get a second 12VHPWR connector installed for the 800W BIOS. Should be fun.
 
£3000 for the Founders 6090 and likely £5000 for an AIB model. PC gaming has become a rich man's hobby once again. Hopefully Valve can price their Steam Machine decently.
You would hope the Steam Machine should make it even more obvious that nobody needs a 5090 to play and enjoy PC games. You can still put together a solid little gaming machine for not much money as long as you don’t succumb to all the marketing.
 
You would hope the Steam Machine should make it even more obvious that nobody needs a 5090 to play and enjoy PC games. You can still put together a solid little gaming machine for not much money as long as you don’t succumb to all the marketing.

While I want to agree, isn't it also going to be "overpriced" for what it offers?

Unless you plan on playing primarily low-demanding indie games, and you could run those on a £100 mini PC, I expect.

I see the appeal but not for anyone who is in the market for a 5090.
 
While I want to agree, isn't it also going to be "overpriced" for what it offers?

Unless you plan on playing primarily low-demanding indie games, and you could run those on a £100 mini PC, I expect.

I see the appeal but not for anyone who is in the market for a 5090.

Apparently Linus was playing Cyberpunk at "4K" 60FPS with a variety of settings and upscaling, and considering it's running on Linux, it's likely running closer to the metal than Windows currently does, given all the bloat.
 
While I want to agree, isn't it also going to be "overpriced" for what it offers?

Unless you plan on playing primarily low-demanding indie games, and you could run those on a £100 mini PC, I expect.

I see the appeal but not for anyone who is in the market for a 5090.
Idk, I'm sick of playing games at 4K 240Hz. The idea of 4K 60 using dreadful FSR is incredibly appealing.

I do think the price will be difficult to stomach, not to mention the system will age very poorly. It won't be able to play many AAA games now with 8GB of VRAM, and certainly won't in a year or two. Not to mention, multiplayer games with anti-cheat don't work 9 times out of 10. But if it puts a tiny bit more pressure on MS to improve Windows for gaming, I am 110% behind it.
 
Not to mention, multiplayer games with anti-cheat don't work 9 times out of 10.
This is on the game developers though, isn't it, and only impacts the games that require kernel-level anti-cheat?

I'm not overly clued up on this - I have been looking at potentially installing a Linux build on a separate SSD, but in terms of anti-cheat it does seem that on Linux it'll be reliant on each game's anti-cheat being developed as a loadable kernel module and loaded when you start the game. But the current userbase of Linux doesn't justify game companies spending time and effort on it.

Hopefully the Steam Machine will make Linux a more common system for games, and we'll start to see companies making sure their multiplayer games work on Linux.

Both The Finals and Arc Raiders run on Linux!
 
I've limited mine to 600W. It worked better than undervolting to achieve the same 600W max power output. Amps and temps on the connector are good; with the Matrix BIOS better again, higher clocks and steady. Only using that power on BF6 for max FPS with no frame gen.
Other single-player games I undervolt and use MFG.
I'd like to think anyone running an 800W BIOS is monitoring the connector? I know, you'll crack the jokes :cry:
Personally I haven't undervolted at all for a while, as every time I thought it was stable something crashed randomly, and I just can't tolerate that at my age anymore - stability over everything for me. Power limiting is what I actually do these days; it works very well with an 80-85% limit most of the time. Though now in winter I've put it back to 100% :) That said, the only thing I care about at the moment is VRAM amount, and only because I run AI models 90%+ of the time for various uses rather than gaming. Way more fun for me than modern AAA - a new hobby in a way. ;) It made me think about possibly upgrading from a 4090 to a 5090 just to get 8GB more, as 24GB is the absolute bare minimum.
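For what it's worth, the power-limit approach above can be scripted rather than set through vendor tools. A minimal sketch using the NVML Python bindings (the `nvidia-ml-py` package, imported as `pynvml`); the 80-85% figures come from the post above, while the `power_limit_mw` helper and the use of the card's default limit as its TGP are my own illustrative assumptions, and actually applying a limit needs admin/root rights:

```python
def power_limit_mw(tgp_w: float, pct: float) -> int:
    """Convert a percentage of the card's TGP into the milliwatt value NVML expects."""
    if not 0 < pct <= 100:
        raise ValueError("pct must be in (0, 100]")
    return int(tgp_w * pct / 100 * 1000)


def apply_limit(gpu_index: int = 0, pct: float = 85.0) -> None:
    """Cap one GPU's power draw (requires root and `pip install nvidia-ml-py`)."""
    import pynvml
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
        # The default (board) power limit is reported in mW; treat it as the TGP.
        tgp_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
        pynvml.nvmlDeviceSetPowerManagementLimit(
            handle, power_limit_mw(tgp_mw / 1000, pct))
    finally:
        pynvml.nvmlShutdown()


# e.g. an 85% cap on a 450W card works out to 382.5W
print(power_limit_mw(450, 85))  # 382500 (mW)
```

The maths is the easy part; the point is that NVML takes milliwatts, so a percentage cap has to be converted before being set.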
 
Genuinely lost all interest in current GPUs, lol - perhaps even future ones. By the time my 4090 is too old to run games well, the gaming space will have shifted to a big enough degree that a dedicated GPU probably won't matter any more, as AI/ML will be doing all the heavy lifting anyway and we will be "gaming" on any type of machine.
For gaming the same, but for AI (currently my main hobby) even the 5090 sounds like not enough VRAM... :D
 
This is on the game developers though, isn't it, and only impacts the games that require kernel-level anti-cheat?

I'm not overly clued up on this - I have been looking at potentially installing a Linux build on a separate SSD, but in terms of anti-cheat it does seem that on Linux it'll be reliant on each game's anti-cheat being developed as a loadable kernel module and loaded when you start the game. But the current userbase of Linux doesn't justify game companies spending time and effort on it.

Hopefully the Steam Machine will make Linux a more common system for games, and we'll start to see companies making sure their multiplayer games work on Linux.

Both The Finals and Arc Raiders run on Linux!

Yeah, it's just kernel ones.

Apparently the "compatible" ones don't really do much on Linux. I've seen this https://tulach.cc/the-issue-of-anti-cheat-on-linux/ posted whenever this topic comes up, and it seems to suggest the Linux version doesn't really do anything. Apex Legends also binned off its Linux-compatible anti-cheat, as it was too easy to get around. Don't know how accurate the blog is, but no one seems to dispute the technical side of it whenever it comes up.

Overwatch is Linux-compatible (user-mode anti-cheat) and it's full of cheaters, far more than any game I've played with kernel anti-cheat. I literally had a game the other day where someone outright said they "changed settings" after going from not knowing what day it is to being amazing after they lost a round. You'll also frequently catch people watching through walls if you watch enough replays. The only thing that saves Overwatch is that it's team-based plus all the abilities. Even rage hacking won't save you from a charging Reinhardt, or D.Va just eating your shots with Defense Matrix.

So even if you can get it working on Linux, you will have to sell that to the Windows players: accept more cheaters so a small percentage of Linux players can play. That's a hard sell.
 
Is it just a case of taking out the 3080 and sticking the 5070 Ti in, or is there more to it?
Do you ever update your drivers? If not, I would update to the latest version first and then swap out the GPU.

I went from a 4070 Ti to a 2070 Super with no issues, but the drivers were quite recent. Older GPUs should be fine, but it would depend on whether your current drivers support the 50 series GPUs.
 
Had to check back in - bought the Asus Prime OC 5070 Ti 10 days ago to replace my 3080 Ti FE, and zero regrets. Great bump in performance and miles less heat/noise; the GPU and chassis intake fans barely spin up now.
 
Do you ever update your drivers? If not, I would update to the latest version first and then swap out the GPU.

I went from a 4070 Ti to a 2070 Super with no issues, but the drivers were quite recent. Older GPUs should be fine, but it would depend on whether your current drivers support the 50 series GPUs.
Graphics drivers are always updated. BIOS, chipset etc., it's been a while.
 
You would hope the Steam Machine should make it even more obvious that nobody needs a 5090 to play and enjoy PC games. You can still put together a solid little gaming machine for not much money as long as you don’t succumb to all the marketing.
If you look at Reddit it's "you need a 9800X3D, everything else is crap", followed by countless posts from people with problems: unbalanced systems, five million fans overheating everything, with air dead spots as they all work against each other.
 
What do you use AI for ? I wouldn't mind dabbling.
Uff, different models for different uses. Gemma 3 I use for writing - it behaves like an uncensored buddy I can throw ideas at, and it throws back something useful that I then get a framework from and expand on. Unlike commercial models, it has zero stops and behaves very human-like (and not assistant-like). Then I have models for assistance that I can feed private data I don't want corporations to extract for training etc.
Then there's astrophotography and various AI-based tools assisting with processing (denoising etc.), as taking photos of celestial objects in London is not that easy otherwise.
Then there are AI tools to process older videos for quality improvement (got some of my old family VHS stuff in sensible quality finally!).
And of course the whole image and video generation, often paired with writing - it helps visualise certain things and events. Gemma recognises images and comments, then helps define prompts.
In other words: LLMs, generators and tools. And every few months they get nicely improved. Loads to learn as well, which is then most useful in my IT work - understanding how LLMs are set up from the server side of things helps in understanding how they work and behave depending on parameters, and how best to use them at work later.
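On the "24GB is the bare minimum" point that keeps coming up: there's a common back-of-envelope rule for whether a local model fits - weights take roughly parameter count times bytes per weight, plus headroom for the KV cache and runtime buffers. A hedged sketch (the 20% overhead factor is my own assumption, not a measured figure, and real usage varies with context length and runtime):

```python
def vram_estimate_gb(params_billion: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    """Rough VRAM needed to load a model: weight bytes at the given
    quantisation width, padded by an assumed overhead factor for the
    KV cache and runtime buffers. Uses 1 GB = 1e9 bytes."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return round(weight_bytes * overhead / 1e9, 1)


# A 27B model (roughly Gemma 3 27B) at 4-bit quantisation vs full fp16:
print(vram_estimate_gb(27, 4))   # 16.2 - squeezes into a 24GB card
print(vram_estimate_gb(27, 16))  # 64.8 - nowhere near fitting on one GPU
```

Which is why quantisation matters so much for single-card use: the same weights at 4-bit need a quarter of the memory of fp16, and the jump from 24GB to 32GB mainly buys longer context or one size class up.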
 