
Blackwell GPUs

A mate of mine said recently he was talking to someone who works in the industry, and this contact of his - simply by watching patterns in companies, not from some hush-hush insider info or anything - genuinely believes there are maybe 1-2 more gens of GPU left before companies slowly move away from consumer hardware, as there's more and more money to be made from AI, data centres etc... So what's the use in charging £800-£1,000 for an xx80 GPU when you can slap a bit more memory on it and sell that same silicon for £8,000 a pop to some big company?
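To see why that maths is so tempting, here's a rough back-of-the-envelope in Python. Only the £1,000 and £8,000 figures come from the post above; the dies-per-wafer and memory-cost numbers are purely my own assumptions for illustration:

```python
# Rough back-of-the-envelope: revenue from selling the same silicon to
# gamers vs. to data centres. All numbers are illustrative assumptions,
# not real figures.

dies_per_wafer = 60          # assumed usable xx80-class dies from one wafer
consumer_price = 1_000       # GBP per card, roughly the xx80 price above
enterprise_price = 8_000     # GBP per board with extra memory, as above
extra_memory_cost = 500      # assumed extra cost of the bigger memory stack

consumer_revenue = dies_per_wafer * consumer_price
enterprise_revenue = dies_per_wafer * (enterprise_price - extra_memory_cost)

print(f"Consumer revenue per wafer:   £{consumer_revenue:,}")
print(f"Enterprise revenue per wafer: £{enterprise_revenue:,}")
print(f"Ratio: {enterprise_revenue / consumer_revenue:.1f}x")
```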

The reason for this is that it's not going to progress forever the way you're envisioning. Like anything else in hardware, specialisation will occur, so "AI" compute devices will become just that and not actually GPUs anymore. Once the industry moves away from the 'general' approach, the portion of the design that has always been the desktop plug-in board will be replaced by dedicated mega-corp data farm units that make the best use of space and power.

From the other end, you have integrated graphics, which could make more headway, though we were promised that a long time ago. We'll have to see how this AI fad plays out and settles before we can see how it has shaped the gaming GPU, but you're right to observe that it's going to strip away the high end, where that silicon will be reserved for higher-paying business - until investment in that space slows or it forks off onto dedicated hardware devices.
 
Exactly.

On a positive note, Intel should have their drivers in fairly decent shape by 2035 if they keep working hard on them :D
Well, Intel don't seem willing or able to sort out older games, but if they keep on top of current and new games, then by 2035 that should be plenty.

And CPUs will continue to have some onboard iGPU, so for truly old games an AMD (or Nvidia, if ARM for desktop ever takes off) iGPU should be enough to run them.
 
Here come the silly guesses. Let me join in.

I think it will cost £4999 :)

No, hear me out. Why would they charge less when they can use the limited sillycone to supply the likes of Elon Musk's companies, who would pay much more? :p:D
I'm 100% sure that if they thought they could get away with it, they would have done it in an instant. It's all about how many whales are out there versus how many cards Nvidia have to sell - just compare the ratio and there we go: the 5090 for whales, the rest for the populace. Since the 20th century the populace could afford all these consumer products, but before that the best stuff was reserved only for the richest. And we are slowly moving back in that direction, one notices.
 
What is your cut-off point? £1999? :)
At this price range the cut-off point is very fluid and likely just psychological, which comes down to 1 being lower than 2 - hence £1999 seems much lower than £2000 even though it isn't (which is why prices often end with .99 etc.). However, we've seen 4090s going for much more than this for slightly fancier versions, so I expect them to reach even £2999 for the fanciest ones, easily.
 
When people are making guesses they should be based on MSRP and FE prices. Aftermarket prices can vary a lot.
MSRP has been a made-up BS price, though, as even FE prices are often different. It's just suggested pricing for retailers for the most basic version possible - and the FE is already specced above the most basic version. I'd really just forget MSRP even exists; as I said, it's nothing but marketing rather than reality. FE prices should be closer to the market average for a sensible version, though we usually see some cheaper versions too (with worse cooling and board etc.).
 
(this does not apply to those who think buying very expensive is a badge of eliteness)
But what else is there to luxury products like that? :D Nobody NEEDS that for gaming, people WANT it - it's a big difference. I could likely still game on an xx60 card: just drop the details, turn on DLSS and there we go, playable. But I don't want to... :P
 
You paid £500 more than release-day prices... This is the problem: people aren't even aware what the real MSRP is for the AIB cards. The retailer thanks you for a £500 bonus on top of their profit margin.
Again, there's no such thing in real life these days as MSRP - MSRP is a suggested price for the most BASIC model possible of a graphics card with a given GPU. It's not a suggested price for FE or AIB cards; those are totally different things. I really wish people would stop treating MSRP as something meaningful - it's just marketing BS. FE pricing is much more sensible to use, as that seems to be the market average for a sensibly well-specced model, though it'll be higher than MSRP for sure.
 
I remember back in the day the top tier would be a similar price from one gen to the next, so you got more performance for similar money.
That was because every year we had a new, much better production process: smaller chips, more transistors, yet less power use, etc. Those days are over, as silicon tech seems to be right at the wall of what's possible - we no longer get smaller and faster with less power use each generation; now it's bigger and more power use for more speed. That has to cost more in production, especially as pushing those last few bits towards the wall gets more and more expensive too.

I wonder what will happen when we actually reach that wall (which will be soon) - and then what? You can't push power use up forever, can't just make bigger and bigger chips or add transistors forever. You can't just add more cache either, as that won't do much anymore, and the designs of CPUs and GPUs are already pretty polished, so it's really hard to find new gains - all the low-hanging fruit has been taken. Those won't be very fun times, and I reckon 10 more years or so and we're there; we already see some of the effects as it is. Which is why the NVIDIA CEO is pushing so hard towards AI as the only way forward - but that sounds more like desperation than an actual way forward, as current AI is mostly fairly primitive generative LLM models, with very little besides. We are far away from actually proper AI, much farther than we are from full PT with enough rays to make it really pretty in real time. :)
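To put rough numbers on "smaller and faster with less power" versus today, here's a little idealised sketch in Python. It uses textbook-style scaling factors and a made-up starting chip, purely to illustrate the two regimes described above:

```python
# Idealised comparison: a classic Dennard-era node shrink vs. the
# "just make it bigger and hungrier" approach described above.
# Textbook-style numbers, purely illustrative.

def dennard_shrink(area, power, perf, k=0.7):
    """Classic scaling: linear dimensions x k, voltage x k.
    The same design shrinks to k^2 the area, clocks ~1/k higher,
    and chip power falls by ~k^2, so power density stays flat."""
    return area * k**2, power * k**2, perf / k

def go_wider(area, power, perf, factor=1.3):
    """Post-Dennard: voltage barely scales any more, so extra
    performance mostly comes from more silicon running at similar
    clocks - area, power and performance all rise together."""
    return area * factor, power * factor, perf * factor

chip = (600.0, 300.0, 1.0)   # mm^2, W, relative performance (made-up)
print("After a Dennard-era shrink:", dennard_shrink(*chip))
print("After just going wider:    ", go_wider(*chip))
```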
 
I think what will happen is that we might see some good pricing in the gen after Blackwell, as that will be close to the next-gen consoles releasing.

Similar to how they did the 3080, possibly? But if they come out 6 months or more before the consoles, then maybe there's not as good a chance.
Though I'd be surprised if they didn't try semi-reasonable pricing to tempt console buyers away.
All these predictions seem to be ignoring that silicon tech is rapidly approaching the wall. It's still fine to make progress on small, power-efficient chips for consoles and mobile for a while longer, but when you look at the top end of performance there's not much left to give at all, in both CPUs and GPUs. I wouldn't expect any big gains there; it'll likely be that each console generation just comes closer and closer to top PC speeds, as the top end won't have much room left to move.
 
you're right to observe that it's going to strip away the high end, where that silicon will be reserved for higher-paying business - until investment in that space slows or it forks off onto dedicated hardware devices.
It's also that enterprise can keep growing chip size and power use for much longer than home computers can without turning the PC into central heating for the house. Home PCs have limits, and GPUs are already using silly amounts of power for a gaming machine - that will likely be cut short by governments if it keeps progressing like this. It's just not sustainable.
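For a sense of scale, here's a quick Python sketch of the running cost - the wattage, hours and unit price are all my own assumptions, not anything official:

```python
# Rough running-cost estimate for a high-end GPU. Wattage, hours and
# unit price are all assumptions for illustration.

gpu_watts = 450            # assumed board power of a top-end card
hours_per_day = 3          # assumed daily gaming time
price_per_kwh = 0.30       # assumed UK-ish electricity price in GBP

kwh_per_year = gpu_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.0f} kWh/year, roughly £{cost_per_year:.0f}/year")
# ...and every one of those watts ends up as heat in the room,
# which is the 'central heating' problem mentioned above.
```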
 
As the market got bigger, hype trains via YouTube, tech sites' "leaks", forums etc. caused FOMO on a new level, accelerated during the lockdowns. Buying a new high-end GPU at or within 6 months of launch at original pricing is the worst part of the hobby now. You've got to get lucky, even using bot channels. The thought of a new generation of GPUs used to bring only joy; now the joy is joined by a feeling of dread over the stress to come in acquiring one without being scalped :D
 
It's also that enterprise can keep growing chip size and power use for much longer than home computers can without turning the PC into central heating for the house. Home PCs have limits, and GPUs are already using silly amounts of power for a gaming machine - that will likely be cut short by governments if it keeps progressing like this. It's just not sustainable.

In the shorter term, all it means for traditional PC gamers is that you're second fiddle, I'm afraid, reduced to the scraps and egregious pricing. It won't last forever though, lube up.
 
As the market got bigger, hype trains via YouTube, tech sites' "leaks", forums etc. caused FOMO on a new level, accelerated during the lockdowns. Buying a new high-end GPU at or within 6 months of launch at original pricing is the worst part of the hobby now. You've got to get lucky, even using bot channels. The thought of a new generation of GPUs used to bring only joy; now the joy is joined by a feeling of dread over the stress to come in acquiring one without being scalped :D

Plus the feeling of dread of how many hours or weeks/months :( you need to work to pay for a GPU these days :mad:
 
The 5080 seems to be almost what I would expect of a 5070-tier card, a massive step down from the 5090. I guess with no real competition, Nvidia can charge whatever they want for these as well. I'm keeping an eye on Intel's new chips too - I might buy one just to not support Nvidia's monopoly.
 
People forget that this generation had a massive performance improvement on the Nvidia side, and the AMD side still gained 40% to 50% or thereabouts on the higher-end SKUs too.

What made it look worse was the upselling of the tiers, which made most of the generation appear poor.

Moreover, as Nvidia proved, you don't need cutting-edge nodes for decent performance - expect consumer products to be made on lagging nodes. This is totally fine, as lagging nodes also have fewer capacity constraints AFAIK.

The reason for this is that it's not going to progress forever the way you're envisioning. Like anything else in hardware, specialisation will occur, so "AI" compute devices will become just that and not actually GPUs anymore. Once the industry moves away from the 'general' approach, the portion of the design that has always been the desktop plug-in board will be replaced by dedicated mega-corp data farm units that make the best use of space and power.

From the other end, you have integrated graphics, which could make more headway, though we were promised that a long time ago. We'll have to see how this AI fad plays out and settles before we can see how it has shaped the gaming GPU, but you're right to observe that it's going to strip away the high end, where that silicon will be reserved for higher-paying business - until investment in that space slows or it forks off onto dedicated hardware devices.

This, and lots of other companies and governments are pouring money into AI too. They don't want to be reliant on a single hardware supplier who can charge more and more, and those companies also want to increase their own margins. Hence why you have new CPU designs appearing, and it will be the same with AI. There are also national security implications for certain countries who don't want to be beholden to one country.

Also, gaming is still a large market - if Nvidia etc. want to leave, others will step in. Plus it's not in Nvidia's interest to put all its eggs in one basket; it makes sense for companies to diversify too.

Moreover, all the cloud gaming servers will need gaming dGPUs to run on, so it's not like dedicated graphics cards won't exist. You might see a move towards hybrid local/cloud processing - some stuff done in the cloud and the rest locally.

If anything, we will see a move to chiplets, with small dGPUs which can be paired up (like 3DFX was trying to do) and re-used in everything from laptops to higher-end set-ups.

This way you end up having to build only one consumer-grade gaming dGPU design.
 
That was predominantly because they were amazing miners.

The scene has moved on; there are now FPGA bitstreams and ASIC units that blow away dGPU cards for efficiency and profits. It's no longer feasible to buy Ada cards, for example, as Nvidia has kept the scalper margins for itself and now charges 20%+ more than in the Ampere era. Electricity being twice as expensive as it was has stopped most people bothering, because a dGPU now takes years to ROI instead of the days it took four years ago.
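A toy ROI calculation in Python shows why - every number below is a made-up example of mine, not a real hash rate or coin price:

```python
# Toy ROI calculation for GPU mining, to illustrate the point above.
# Every number here is a made-up example, not a real rate.

card_price = 900.0          # GBP, assumed Ada-class card with the extra margin
daily_revenue = 2.50        # GBP/day of coin mined (hypothetical)
gpu_watts = 300.0
price_per_kwh = 0.30        # roughly double what electricity used to cost

daily_power_cost = gpu_watts / 1000 * 24 * price_per_kwh
daily_profit = daily_revenue - daily_power_cost

if daily_profit <= 0:
    print("Never pays for itself at these rates")
else:
    days = card_price / daily_profit
    print(f"ROI in about {days:.0f} days ({days / 365:.1f} years)")
```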

Apple have introduced their own M chips, and as @CAT-THE-FIFTH touched on above there are "AI" CPUs already breaking in with laptops (much like phones), so it won't be long before the segments of the pie establish themselves and the gaming GPU is pushed aside to where it was a couple of decades ago.
 
It's a flawed premise to assume that Nvidia isn't going to be the one leading the change. Anyway, three fundamental questions arise:
1. What's the basis for assuming that GPUs are not the right fit for AI? From what I have seen GPUs are perfect for AI workloads; all you have to do is get rid of the fixed-function pipeline, which I'm sure Nvidia has been doing for their AI chips - ROPs, TMUs, RT units etc. don't seem to appear in them.
2. Why are people not talking about smaller chips on a cutting-edge node instead of larger chips on a lagging node? (How will the transition be made? They can't keep selling the same level of performance.) See the rough cost sketch below.
3. Where is the additional capacity going to come from? Currently it just seems like the semiconductor industry has two critical bottlenecks.
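On question 2, one rough way to frame it is cost per good die. Here's a toy Python sketch; the wafer prices, defect densities and die areas are all made-up assumptions, just to show the trade-off between a small die on an expensive cutting-edge wafer and a bigger die on a cheaper lagging one:

```python
import math

# Toy cost-per-good-die comparison: small die on an expensive cutting-edge
# node vs. a bigger die on a cheaper lagging node. All numbers are
# made-up assumptions purely for illustration.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    # Standard approximation: gross dies minus an edge-loss term.
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost, die_area_mm2, defects_per_mm2):
    yield_rate = math.exp(-die_area_mm2 * defects_per_mm2)  # simple Poisson yield model
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate
    return wafer_cost / good_dies

# Same design, assumed to need ~1.6x the area on the older node.
leading = cost_per_good_die(wafer_cost=20000, die_area_mm2=250, defects_per_mm2=0.0010)
lagging = cost_per_good_die(wafer_cost=9000,  die_area_mm2=400, defects_per_mm2=0.0007)

print(f"Leading-edge node: ~${leading:.0f} per good die")
print(f"Lagging node:      ~${lagging:.0f} per good die")
```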
 