The future

Lately I have been thinking about what's going to happen when silicon reaches the end of its rope. AMD & NV are already swimming upstream with these tiny lithography processes (20nm); TSMC employees must be riddled with stomach ulcers and other stress-related illnesses. The IP that goes into making GPUs has been built up over decades, and it's very hard, if not impossible, for a new player to enter the market.

But when silicon is replaced, will that mean a fresh start? Will there be a gold rush of new GPU companies? It's very sad to think we might just have 2 companies or go down to 1 for the rest of time.
 
I don't think we will see the end of silicon for some time yet. If needed, I think you will see more dual and even quad GPU cards. There is nothing to stop Nvidia putting 4 x GK110 chips on a single board and watercooling it.

We all think of 4 GPUs as the maximum; why not 8 or even 16 running at lower clocks and producing less heat?

I think people will have to start moving away from the "one big GPU is better" view. If done right, quad GPUs look better with fewer glitches than either a single or dual GPU setup.
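
A very rough back-of-the-envelope sketch of that idea, assuming the classic CMOS approximation that dynamic power scales with frequency times voltage squared, and that throughput scales linearly across GPUs (both generous simplifications; every number below is made up for illustration):

```python
# Rough sketch: why several GPUs at lower clocks can do more work per watt
# than one big GPU. All figures are illustrative assumptions, not measurements.

def dynamic_power(freq: float, volt: float, base_power_w: float = 250.0) -> float:
    """Classic CMOS approximation: dynamic power ~ f * V^2, relative to a 1.0/1.0 baseline."""
    return base_power_w * freq * volt ** 2

# One big GPU at full clock and voltage.
single_power = dynamic_power(freq=1.0, volt=1.0)      # ~250 W
single_throughput = 1.0                                # normalised

# Four GPUs, each at 60% clock and 80% voltage, assuming perfect multi-GPU scaling.
quad_power = 4 * dynamic_power(freq=0.6, volt=0.8)     # ~384 W total
quad_throughput = 4 * 0.6                              # 2.4x the work

print(f"single GPU: {single_power:.0f} W for {single_throughput:.1f}x throughput")
print(f"quad GPUs : {quad_power:.0f} W for {quad_throughput:.1f}x throughput")
ratio = (quad_throughput / quad_power) / (single_throughput / single_power)
print(f"perf-per-watt, quad vs single: {ratio:.2f}x")
```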
 
Yes, if you have the money, Kaaps "cough".

I got my SLI 670s when OCUK had that mega price: 4GB 670 at £200 + EK FC block with backplate at £100, x2 = £600. Still a lot of money, but it depends what card model/make you get as well.
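
For clarity, the maths in that deal, assuming the "x2" covers both the card and the block (a hypothetical restatement of the quoted prices):

```python
# Restating the quoted OCUK deal: two 4GB GTX 670s, each with an EK FC block + backplate.
card_price = 200    # GBP per 4GB GTX 670
block_price = 100   # GBP per EK FC block with backplate
num_cards = 2       # SLI pair

total = num_cards * (card_price + block_price)
print(f"Total: £{total}")   # £600
```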

Also, you would not game at just 1920x1080 with that setup; you want at least a 27" monitor at 2560x1440, depending on the amount of VRAM.

Still loving my SLI 670s at 2560x1440, even more since I upgraded from my i7 930 @ 4.0GHz to a 4820K @ 4.6GHz. :)
 
Don't GPUs technically already have thousands of "cores"? :p

IIRC that's what some were calling SPs/shaders back when they were new. I would liken a dual-GPU card to a dual-socket mobo. Of course, with 16nm and chip stacking etc. things could change significantly; we might not recognize a graphics card from 2020 if we saw it today.
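
A small illustration of the distinction, using GK110's published layout (15 SMX units of 192 shader processors each); the dual-GPU card below is just an assumed example:

```python
# "Thousands of cores" = shader processors (SPs / CUDA cores) inside one GPU,
# which is a different count from the number of discrete GPUs on a board.
smx_per_gpu = 15          # SMX units in a full GK110
sps_per_smx = 192         # shader processors per SMX
cores_per_gpu = smx_per_gpu * sps_per_smx    # 2880 shader cores

gpus_on_card = 2          # assumed dual-GPU card, like a dual-socket motherboard
total_sps = gpus_on_card * cores_per_gpu     # 5760 "cores" in marketing terms

print(f"shader cores per GPU : {cores_per_gpu}")
print(f"discrete GPUs on card: {gpus_on_card}")
print(f"total shader cores   : {total_sps}")
```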
 
Influx of new companies, no.

Ultimately if whatever might eventually replace silicon comes through... it's going to be expensive and difficult.

Let's say, for example, they move to graphene. Today Nvidia/AMD/Intel have precisely as much experience making GPUs on graphene as you and I do, but they still have an IP portfolio, deep pockets and engineers in place to produce GPUs on an industrial scale.

A new company would only be able to match AMD/Nvidia in terms of the new process, not the quality of the GPU architecture, the game dev/industry support, the teams in place, or the drivers that work with past games.

The only way I can see there being a "new" name in gaming GPUs is if someone already making GPUs attempts to step up to the high-end gaming bracket, or if someone bought out Nvidia/AMD's GPU division... but then that would be the same as now, just under a new name.

You have several pretty large players in graphics that aren't AMD/Nvidia; in terms of volume, ARM's Mali, Imagination and a couple of others spank AMD/Nvidia all over the place. But none of them have attempted to attack AMD/Nvidia in the desktop PC/gaming space, because the support/history/etc. just isn't there. Those are things that take years to develop, while basically being a huge profit sink for years.

Any new gaming GPU maker would have to design an entire architecture for gaming, or adapt an existing one to a larger scale (not nearly as easy as some, like Pottsey, would have you believe with the theoretical billion-core version of whatever they have in mobile), which will cost closer to billions than millions. There will be no support for them in existing games and no optimisation, so even if they make something that works perfectly with DX11, it will still have multiple bugs in most games. If they target mainstream games and adapt drivers, they'll still have very low sales and inexperienced support staff for 2-3 generations, during which you're talking about sinking hundreds of millions attempting to build the support that might be the foundation for increasing market share. All the while fighting against established companies with that support, better experience and good architectures.

New process nodes won't really make a difference. The manufacturing isn't easy, but it's effectively a new ball game every 2-3 years already: double/triple patterning and FinFET designs need massive new knowledge and skill and completely different design rules, and so will whatever the next process tech is. Architecture is what makes the cards good or not.

If AMD or Nvidia folded, I could potentially see one of the existing mobile GPU makers thinking about making a play for some of the high-end market, but it's a very expensive area in R&D for relatively little profit, with two hugely established competitors, so currently I don't think anyone would try.

Matrox were the last company to try, and for all their work they made something that wasn't good enough. They realised they could either stick it out, improve every generation and be competitive in a few years, or get out then and save the money, as they couldn't be certain of profit even if they did become competitive. R&D today versus 10 years ago is an order of magnitude more expensive and riskier, and not really any more profitable.
 