Nvidia RTX 6000 series (codename Rubin)

Like it or not, the way these companies will get around the slowing of Moore's law is with innovation coming not so much in the hardware space as in the AI space.

The 6000 series will just do more AI stuff with our games.

Personally I cannot wait for the 6080, as I would like an upgrade.

You're probably right. It will be very interesting to see what happens when they let AI take a crack at the overall design of a new architecture, or imprint over existing designs.
 
Timeline update, and unfortunately it's a delay. When I made this thread in January 2025, the belief was that Rubin server would launch very early in 2026, and then, based on Nvidia's standard six-month cadence between server and desktop, the desktop RTX 6000 cards would launch in late 2026.

However, Jensen now says Rubin server enters mass production this time next year, meaning late 2026. So now we're looking at desktop RTX 6000 cards only launching mid to late 2027, meaning we're still 18 to 22 months away from new Nvidia GPUs.

Kind of annoying for me, as I wanted to build a new PC next year and now I have to wait until 2027.

I don't think Rubin's timeline will slip. I'm very impressed that Jensen managed to reveal the massive Rubin R100 or R200 chips for the first time months early, rather than waiting for GTC 2026 in March 2026 to reveal them, and Rubin server chips will enter mass production in Q3 or Q4 2026. The Rubin chip Jensen revealed had only just arrived in the labs from TSMC and was actually a prototype; it was manufactured in week 38, which was 15 September, last month.

Let's look back at the Blackwell timeline in 2024. Jensen revealed the B200 chip for the first time on 18 March 2024, but unfortunately the camera guy didn't manage to get clear shots of the chip to see the manufacture date. As with the Rubin chip, the GB200 he showed was a prototype, likely manufactured in February 2024. Fast forward to July 2024: somebody leaked a photo of an RTX 5090 GB202 prototype chip for the first time, with a manufacture date of 2024 week 29, which was 15 July 2024. It took Nvidia over three months to go from the taped-out GB202 prototype to a mass-production chip in week 46, so that was 28 October 2024. The B100 and B200 server chips were probably in mass production months before they launched in November 2024, and Nvidia decided to move the Blackwell desktop GPU launch from November 2024 to January 2025 at CES. If Nvidia hadn't moved the desktop launch to January 2025, the whole Blackwell lineup, server and desktop, would have launched in November 2024.
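
For anyone who wants to sanity-check those week codes, the conversion is easy to script. A minimal sketch in Python, assuming the date codes stamped on the chips follow ISO 8601 week numbering (my assumption, not something Nvidia documents), taking the Monday of each week:

```python
from datetime import date

def week_code_to_date(year: int, week: int) -> date:
    """Return the Monday of the given ISO year/week."""
    return date.fromisocalendar(year, week, 1)

# The week codes mentioned above:
print(week_code_to_date(2025, 38))  # 2025-09-15, the Rubin prototype
print(week_code_to_date(2024, 29))  # 2024-07-15, the leaked GB202 prototype
```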

I think Rubin will follow the same timeline as Blackwell. Nvidia should have the first RTX 6090 prototype tape-out about five months after the Rubin server prototype chips, around February 2026, with a production chip ready for mass production around June 2026. Nvidia could launch Rubin desktop GPUs in summer 2026, or move the launch to November 2026 alongside the server chips, or launch at CES 2027, exactly two years after the RTX 50 series launch event.
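
To make that projection concrete, here's a rough sketch of the arithmetic. The five-month prototype gap and the roughly four-month tape-out-to-production gap are this thread's guesses extrapolated from Blackwell, not anything Nvidia has announced:

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Shift a date forward by whole months (day clamped to the 1st for simplicity)."""
    total = d.year * 12 + (d.month - 1) + months
    return date(total // 12, total % 12 + 1, 1)

rubin_server_proto = date(2025, 9, 15)               # week 38 2025, per the post above
desktop_tapeout = add_months(rubin_server_proto, 5)  # ~Feb 2026, assumed 5-month gap
mass_production = add_months(desktop_tapeout, 4)     # ~Jun 2026, mirroring GB202's ~3-4 months
print(desktop_tapeout, mass_production)              # 2026-02-01 2026-06-01
```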

But a launch in mid to late 2027 is far too long away; both Microsoft and Sony have confirmed that the next-generation Xbox and the PlayStation 6 will launch in late 2027. AMD and Nvidia will definitely launch their next-generation GPUs before mid-2027, whether in mid-2026, late 2026 or at CES 2027, ahead of the next-generation consoles.
 
2026 for the 6000 series wouldn't make sense IMO, as the Supers aren't launching until Q2 2026. I'd peg it as late January or early February 2027. It will be interesting to see what AMD do, though.
 
"Like it or not, the way these companies will get around the slowing of Moore's law is with innovation coming not so much in the hardware space as in the AI space. The 6000 series will just do more AI stuff with our games. Personally I cannot wait for the 6080, as I would like an upgrade."

You mean the AI deepfake faces? Or fake voices? None of that is required or desired in most games, and mainstream GPUs can't even handle it, so it would be a gimmick reserved for a chosen few that people will mostly hate with a passion, I suspect. AI can't and doesn't create anything; it can only interpolate, it can't extrapolate. Ergo, it can rehash already existing things, fill in some blanks etc., but it can't create anything new that it hasn't seen before. Current models work great in DLSS and as a chatbot pretending to be someone it got trained on, but that's mostly where it ends. Plus the whole shenanigans with hallucinations, which they've still not solved. And if you get AI-generated graphics, it will never be the same twice, as these models generate different output from the same input each time. I just don't see it - it's marketing rubbish in my eyes, hardly anything new or useful beyond possible gimmicks.
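
As a toy illustration of that interpolation-versus-extrapolation point (just a curve fit in NumPy, not a claim about how DLSS or any Nvidia model works internally): a model fitted on one range of inputs can track the truth closely inside that range and fall apart the moment you step outside it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fit a degree-5 polynomial to noisy samples of sin(x) on [0, 5]
x_train = np.linspace(0, 5, 50)
y_train = np.sin(x_train) + rng.normal(0, 0.05, x_train.size)
coeffs = np.polyfit(x_train, y_train, deg=5)

# Inside the training range the fit is close (interpolation)...
x_in = np.linspace(0, 5, 100)
err_in = np.max(np.abs(np.polyval(coeffs, x_in) - np.sin(x_in)))

# ...outside it the error explodes (extrapolation)
x_out = np.linspace(5, 10, 100)
err_out = np.max(np.abs(np.polyval(coeffs, x_out) - np.sin(x_out)))

print(f"max error inside  [0, 5]: {err_in:.3f}")   # small
print(f"max error outside [5,10]: {err_out:.3f}")  # orders of magnitude larger
```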
 
No, I mean AI innovation in general.

It already happened with the 5000 series and DLSS 4's MFG. We will get new innovations in this space and other AI-driven innovations.

Things like ACE (I think it's called that), the technology they demoed with NPCs that have more natural, reactive dialogue and so on...

Basically tech we don't currently know about.

It's already happened with the 5000 series. No one wanted MFG, but we got it.

People say Nvidia is now an AI company. So they will be giving us hardware simply to complement the AI.

Not all of it is a gimmick. Is DLSS a gimmick?

Just came up on my phone randomly. But what's the key word here? AI.
 

Nvidia's standard FG is OK, but MFG is kinda crappy, as it introduces noticeable artifacts and increases latency quite a bit. It's a gimmick used to say "look, we have more frames" when in reality it's not really true.
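
The latency point is easy to put rough numbers on. A back-of-the-envelope sketch, assuming (as is typical for interpolation-based frame generation, and my assumption here) that the pipeline must hold back one fully rendered frame before it can synthesize the in-between frames:

```python
def fg_holdback_ms(base_fps: float) -> float:
    """Extra latency from buffering one rendered frame, in milliseconds.

    Interpolation needs frame N+1 before it can synthesize frames between
    N and N+1, so displaying frame N is delayed by roughly one frame time.
    This ignores the cost of generating the frames, so it's a lower bound,
    and 2x vs 4x MFG inserts more frames between the same two rendered
    frames, so a higher multiplier doesn't shrink this penalty.
    """
    return 1000.0 / base_fps

for fps in (30, 60, 120):
    print(f"{fps} fps base -> at least ~{fg_holdback_ms(fps):.1f} ms added")
# 30 -> ~33.3 ms, 60 -> ~16.7 ms, 120 -> ~8.3 ms
```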

The ACE demo was cringe at best.
 

Not all innovations work or take off.

But I guarantee they will be doubling down on it in the future.

Who's to say, as with the jump from DLSS 3 to 4, that we won't get an improved MFG with much better latency?

See what I'm saying?

There's more rapid progress to be made in the AI space than there is in physical hardware.
 

Seems the only people creaming over AI are those financially invested in AI or those who always fall for the latest trend.

AI will eventually stop being a buzz, and then people, like good sheep, will move on to the next trend that corporations tell them to like.
 

We are only at the beginning of the AI revolution.
 
AI is good for making companies money; for the general public, AI is at best a gimmick, and it will remain that way.

The 5000 series already showed us that this is the direction the industry is taking.

You can deny it, but it's facts.

Do you expect the 6000 series is going to be less AI-focused than the 5000 series?

AI is not a gimmick. It's just in its infancy.

For example:

Nvidia CEO Jensen Huang stated in October 2025 that 100% of Nvidia's software engineers and chip designers now use an AI coding assistant, specifically Cursor. This was a major endorsement of AI's role in engineering and a practical demonstration of his broader vision for the technology.

Have some vision, man; we are at the very beginning of this technology.

It's the next wave of innovation.
 

You seem overly invested in AI :D

Nvidia are the ones pushing AI gimmicks, but the general public aren't asking for them, nor do most seem interested. In fact, mention AI to most people and they'll roll their eyes.
 
Because they don't understand.

No, they do. And for the most part, right now, it's a money-making gimmick.

It'll stop being a gimmick when it actually does something positive for the general populace, outside of little showcases here and there, and not just for corporations. And that won't be anytime soon.
 

Let's put it this way.

I saw a stat yesterday saying that 300 billion lines of code are uploaded to GitHub every day.

And there's a big chance a large proportion is being written by AI agents.

So it's already having an impact on the ordinary person's life.

"Nvidia CEO Jensen Huang stated in October 2025 that 100% of Nvidia's software engineers and chip designers now use an AI coding assistant, specifically Cursor. This was a major endorsement of AI's role in engineering and a practical demonstration of his broader vision for the technology".

Nowadays, if you are NOT using AI, you are at a disadvantage.

New jobs are starting to appear: instead of software developer, the job is to review code generated by AI agents. These are examples of how the industry is changing.

Look at AMD: just today I see they have joined up with OpenAI. They are all at it!
 

You're coming across like an AI bot... :D

Most of the code written by AI is filled with bugs, and the ordinary person has no idea what GitHub is, so no clue what hole you pulled that from.

Of course a rich man who sells AI hardware is going to say that. Come on, bud, use some common sense :cry:

Most people don't work in positions that have anything to do with AI; AI is still very much a niche pushed by marketing types and those who make money from it.

I'm not against AI. I think DLSS Super Resolution is amazing, and base frame gen can make a game very smooth, but in general AI right now is a gimmick.
 
"No, I mean AI innovation in general."
There isn't any; they are stuck with ideas that already exist in practice. Sure, they make them better with time, but no actual innovation happens, as they have no clue how to move forward from here to more general AI. It's mostly just marketing rubbish. I've been talking to a few Nobel prize winners recently (I work in IT, but around scientists) and they seem very sceptical about the general usability of today's AI models beyond what they are already used for in science (mostly automation and replacing humans in very tedious, repetitive tasks), as, again, no way to really move things forward has been presented yet.
"It already happened with the 5000 series and DLSS 4's MFG."
It's just the good old transformer model, known for ages; nobody wanted to touch it, thinking people wouldn't like it, and the hardware wasn't fast enough either. Nvidia's marketing won here, though. That's not real innovation; best not to confuse these things. FG in general has been known for ages too, and it still has the same problems as ever, but it has gained much more acceptance these days out of necessity. If it wasn't necessary, hardly anyone would use it, but it is what it is.
"We will get new innovations in this space and other AI-driven innovations."
We will get gimmicks for sure; hardly anything useful has been shown so far, aside from texture compression (but that's not a new idea either, and again it comes to life out of necessity).
"Things like ACE (I think it's called that), the technology they demoed with NPCs that have more natural, reactive dialogue and so on..."
And then you will get hallucinating NPCs, which will end with parents flipping the table and games requiring ID (like what's coming to GPT) to be sure they don't go completely off the rails with kids and start flirting with them or persuading them to unalive themselves, etc.
"Basically tech we don't currently know about."
We would've known about it already from leaks etc. if anything revolutionary existed. Expect nothing, and maybe you will then be happy with what comes. Expect something great and you will be very disappointed.
"It's already happened with the 5000 series. No one wanted MFG, but we got it."
Again, out of necessity, because real hardware performance has already stagnated. It's a band-aid.
"People say Nvidia is now an AI company. So they will be giving us hardware simply to complement the AI."
An AI company as in pushing AI in its marketing to sell more of its "shovels" to the "gold miners", not a company that actually develops AI models; they just use what's out there.
"Not all of it is a gimmick. Is DLSS a gimmick?"
Read again what I wrote: DLSS is what current models are good at (interpolation), and that's where it ends.
"Just came up on my phone randomly. But what's the key word here? AI."
What? :D I suggest you watch this, one of many such videos recently, as people are waking up to what it's really all about. It might open your eyes a bit to how big this marketing corpo-idiocy really is:

I'll just add that I am currently making quite good money on the stock market with this AI craze (better than I've done with crypto, which I also disliked). I also use AI extensively at work. I know what the limitations are, and my finger is ready to pull money out of the market the moment it comes crashing down, as it will eventually, and it will be quite a huge explosion when that happens, I predict.
 