Do you think we'll see a third player enter the high-end GPU market any time soon?

No. The days for that are long gone, sadly.

It costs too much money and the competition is too fierce.

The only thing that IMO would work is if Nvidia reinstated the 3dfx branding and released GPUs that differ in core count and clock speed from their main GeForce models.

They could use the branding in either the low end or high end and IMO it would work.

But for anyone else, designing and manufacturing a GPU costs millions. And then Nvidia would simply do what they've always done: talk crap about it, and the mugs would believe every last word.

The only reason GeForce cards became so popular was the naming and marketing. GeForce sounds fast.

In reality though I thought that the Riva TNT2 16MB was slower and more buggy than the 3dfx Voodoo3 3000 card. I also preferred the Voodoo5 5500 to the GeForce 2 because, again, it had better drivers and seemed smoother and less buggy.

The thing is, whilst everyone was walking around saying that the Nvidia cards of the time were faster, there was hardly any way to prove it. We didn't have FRAPS back then and we didn't have any synthetic benchmarks. Not in the mainstream anyway, so Nvidia got away with an awful lot in the bullshine corner of the market.
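(Just to show how little there is to the concept: all a frame counter of the FRAPS sort really reports is frames rendered per second of wall-clock time. A rough Python sketch, where render_frame() is a hypothetical stand-in for the game's actual render loop:)

```python
# Rough sketch of a frame counter: count frames drawn against
# wall-clock time. render_frame is a hypothetical placeholder for
# whatever the game's render loop actually does.
import time

def measure_fps(render_frame, duration=5.0):
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        render_frame()   # draw one frame (placeholder)
        frames += 1
    return frames / (time.perf_counter() - start)

if __name__ == "__main__":
    # Simulate a renderer that takes ~16 ms per frame (roughly 60 FPS).
    print(f"{measure_fps(lambda: time.sleep(0.016)):.1f} FPS")
```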

Sadly it has been shown that we like buckets of Nvidia bullshine and will happily gorge ourselves on it until all of the sodium chloride causes us to have a heart attack.
 
All of Intel's graphics attempts have been truly awful. Although I guess you could argue that's because they are not putting much effort into that side of things.
 
They already have CPUs, motherboards, SSDs and CPU coolers - wouldn't they dominate if they started gfx as well :p

Other than SSDs, AMD do all of the above too, including making GPUs! (Although I'm not sure they directly manufacture motherboards.)

The HD3000 is getting there.
 
Intel, high end graphics? ...lol. There's a reason why they ditched their own internally developed graphics core for a PowerVR design on the new smartphone Atom, and that's because their graphics are so terrible that they couldn't compete in the ferocious smartphone SoC market. As for the HD3000, AMD's Llano integrated graphics absolutely slaughters it - and that's on a low to mid range chip, whereas the HD3000 is only on the highest end Intel chips (the vast majority use the HD2000, which is literally half of an HD3000 - 6 execution units versus 12).

Intel tried to make high end graphics for YEARS with Larrabee, throwing billions of dollars at it. It just wasn't good enough, and they've only just now been able to repurpose all that wasted research into Knights Corner as their answer to Nvidia's Tesla.
 
It'd be nice if PowerVR came back into the fold but I suppose it's difficult for a new kid on the block to gain any foothold.

They aren't really the new kid on the block ;) They had a foray into desktop gfx years ago with their Kyro and Kyro II graphics cards.

They obviously decided it was too cut-throat a market to compete in.
 
I'd love for this to happen, especially if it was Intel. They certainly have the money to funnel into GPU R&D. If they did that, then combined with how advanced their fabs are, they would be able to compete with ATI and NVIDIA. But Intel only seem interested in low end, on-chip GPUs.

Still, given how big they are, I'd love to see them branch out.
 
They've been funnelling money into high end graphics research through Larrabee for years, and it's gotten them nowhere.
 
My first graphics card was also by Trident. I think it was the TVGA 9000. It was 1MB for sure. I had it in my 80386DX 40MHz back then. Then the graphics card died and I replaced it with a 512K card from S3 or something.
 
I saw a joke once, it went something like this: 'I'm testing out Battlefield 3 on Intel's new graphics chip and do you know, it looks exactly the same as it does on AMD and Nvidia chips!' The other guy says 'What's the FPS?' The first guy says 'Dunno, the second frame hasn't appeared yet'.
 
PowerVR could possibly enter the desktop market again, but their speciality lies in the low power end of the spectrum these days.

It will be a long time before any company can enter the high-end market, and they will probably judge that it isn't worth the required initial investment.
 
Interesting thought for a minute...

Intel owns their own fabs, unlike the GPU manufacturers.

Is a 40nm fab a 40nm fab? (i.e. can a fab built to make 40nm CPUs be easily converted to make GPU chips?)

It would give them something to do with all those old fabs when they move processes.
 
I don't know the workings of fabs that well, but at the very least they would require some re-fitting.
GPUs aren't that far behind in fab tech anyway, so I doubt it's a sensible idea.
 
Yes they can. A fab can be used to build any device. The actual architecture of the chip (i.e. whether it's a CPU or a GPU) is a much higher level consideration than the semiconductor process itself. It's because of Intel's lead on process that I'd love to see them do GPUs.
 
The desktop/performance PC market is a high-cost-to-enter, low-margin affair that requires high levels of expertise and support, and a huge marketing budget to compete.

For Imagination Tech it makes more sense to stick to low-power devices, selling to other makers, as this does not cost as much to enter and is a high-volume business.
 