
Intel's first discrete GPU under Raja will be based on the Intel Gen12 GPU architecture

https://twitter.com/TMFChipFool/status/952649312479203331

https://twitter.com/TMFChipFool/status/952650669839839233

Intel's first discrete GPU under Raja Koduri, codenamed Arctic Sound, will be based on the Intel Gen12 GPU architecture, and the second discrete GPU, codenamed Jupiter Sound, will be based on the Intel Gen13 GPU architecture.

Arctic Sound discrete GPUs, due to launch in 2019 or 2020, will probably pack hundreds of Execution Units and compete with AMD Navi and Nvidia Ampere in mobile and high-end desktop discrete GPUs. AMD will probably lose a huge amount of potential GPU market share, and Nvidia will lose some to Intel as well.
 

It's not going to compete with GeForce and AMD gaming cards, not unless nVidia and AMD are giving away graphics IP.
That's not what this is; it's Intel's answer to machine learning and parallel computing. You're not going to be choosing between an AMD, nVidia or Intel GPU for your graphics in 2020, or ever.

PS: AMD have graphics market share? I thought nVidia owned all but about ~10% of it now?
 
How do you see that working, then?

AMD and Nvidia own excessive amounts of IP related to graphics tech.

Several vendors have complete 2D/3D accelerated graphics hardware, so I'm not sure what the patent position is - Intel's SoCs using their own hardware can do complete DX11, etc. rendering, ARM have fully functional GPUs, and so on. I'm not sure what the difference would be in that respect between Intel's integrated jobbies and a scaled-up performance version of the same.

The bigger problem for Intel, if they went for desktop-performance discrete GPUs, is going to be the whole software ecosystem around them: evolving the hardware and working with developers, especially as APIs, etc. change. Their current mentality would see them lagging significantly behind nVidia and the rest there.
 

Intel's current SoCs use nVidia's Fermi IP, so they have that. In terms of graphics it's not going to compete with nVidia or Radeon; just look at Intel's 8th-gen integrated graphics compared to AMD's - in the same power envelope and die size, AMD's integrated graphics are as much as 3 times as fast.

Why do you think Intel are buying in AMD's graphics now to glue onto their own SoCs? :)
 
Put like that, it makes a lot of sense. Perhaps Intel decided to give up on graphics and instead put the focus on learning/compute, where they can compete and the market is younger.
 
It's not going to compete with GeForce and AMD gaming cards, not unless nVidia and AMD are giving away graphics IP.
That's not what this is; it's Intel's answer to machine learning and parallel computing. You're not going to be choosing between an AMD, nVidia or Intel GPU for your graphics in 2020, or ever.

Er, no, this Gen12 architecture is not Intel's answer to machine learning and parallel computing. Intel already has an answer to machine learning, AI and parallel computing in Intel Nervana, codenamed Lake Crest, launched back in October 2017; Lake Crest's successor will be codenamed Knights Crest, and the Nervana parts are all built around a tensor architecture. Intel also already has an answer for servers, deep learning, HPC and supercomputers in the Intel Xeon Phi GPGPUs codenamed Knights Mill, launched last month in December 2017; Knights Mill's successor will be codenamed Knights Peak on a 7nm process, since Intel cancelled Knights Hill, which was planned for 10nm. All Xeon Phi GPGPUs are based on the x86 Many Integrated Core architecture.

So this Gen12 would be absolutely useless at machine learning, parallel computing, servers, HPC and supercomputers - Gen12 is not designed for that type of workload. Gen12 is like all the previous generations, which are only good for watching media and video, running apps, word processing, spreadsheets, databases and GAMING! :D

It's a real shame the Intel Nervana parts and Xeon Phi GPGPUs can't run Crysis - they just can't run any games.

PS: AMD have graphics market share? I thought nVidia owned all but about ~10% of it now?

In Q3 2017 Nvidia owned 73% of the AIB GPU market while AMD had 27%, so Q4 will probably be much worse for AMD, with prices having gone through the roof, all the Vega reference cards gone, and no Vega custom cards for almost 2 months, I think.

We will have to wait and see what Raja Koduri does with Arctic Sound; it may be the first Intel discrete GPU able to run games at up to 4K resolution with maxed settings with ease. Hopefully Arctic Sound will be very good at GPU mining. :)
 
Funny, you said Raja was poop when he worked for AMD. Does the fact that his paycheck is from Intel now verify his competency?

Just asking.
 
How does Intel bypass the IP licensing problem? By teaming up with AMD ;)

I said it before Raja went to Intel: AMD and Intel releasing partner products.
There is a lot more to this...
 
AMD's masterplan? Send Raja to Intel and bring them into the dGPU game to fight Nvidia. Together they squeeze Nvidia down to a much lower share of the gaming and compute GPU markets, while in the CPU market they only have each other to compete against.
 
How do you see that working, then?
AMD and Nvidia own excessive amounts of IP related to graphics tech.
It's not so much the hardware that's the problem for Intel IMO, it's the software. When was the last time you saw Intel post a graphics driver update? One of the biggest reasons Nvidia are so strong is that historically they have always provided users with the best and most stable software, which helped them stand out from the crowd in the early 3D era when there were lots of vendors competing with each other. ATI were also successful in that era but had a reputation for poor drivers and software (I believe the worst time was when they launched their 'Rage' lineup circa 1999), which has taken AMD years and years to shake off. Intel will have to do the same, and frankly, unless they invest a serious amount of money into this venture it won't work.
 
AMD are all but out and probably aren't going to recover; whichever way you cut it, nVidia have no competition, they are effectively on their own.
That will (and probably already has) slow graphics releases and push prices up. In fact, had nVidia had any real competition in the last few years, they wouldn't be renaming what were GTX #80 cards "Titan" and doubling the price for some unchallenged profiteering.

With AMD now effectively out, we need a leviathan like Intel to come in and take on nVidia. Sadly that's not going to happen, because nVidia have IP that is intrinsic to high-performance graphics - intrinsic because it's embedded into modern game development.
What little IP Intel have access to is old and basic; they cannot compete in the graphics world with it. The only hope we have for a competitor in Intel is if AMD are licensing graphics IP to them.

This is plausible given that AMD don't have the money to compete with nVidia in the graphics space, but Intel do, and AMD have the technology.
 
AMD's masterplan? Send Raja to Intel and bring them into the dGPU game to fight Nvidia. Together they squeeze Nvidia down to a much lower share of the gaming and compute GPU markets, while in the CPU market they only have each other to compete against.

I don't know about master plans, but AMD and Intel are no threat to each other if they come to an agreement on carving up x86 between them.

Intel do, however, have a very real threat in nVidia, as do AMD - a common enemy. Working together they could stop nVidia's march towards total computing domination, and if they are smart (which Intel certainly are) they will recognise nVidia's threat and team up with their old arch-rivals.
 
Ask yourself: do AMD need Intel as much as Intel need AMD?
After all, AMD have Ryzen + Vega products, so AMD don't need the CPU hardware.

What AMD do gain from joining Intel is marketing/mind share.
Intel gain hardware (a GPU).
In the end it could be a win-win situation for both companies.
 
I think the maths won out in Intel HQ: spend billions (more) iterating GPU tech that isn't that good, or buy an off-the-shelf solution in AMD's semi-custom line. If it weren't for competition laws, they would probably have bought AMD/RTG outright.
 
If they released a card that was good for gaming but crap at mining, they'd be able to take massive gaming market share quite quickly.
If they can make a card good for mining and gaming, they'd earn *********.

You forget how hard it is to get NV fanatics to try a different card. It's like trying to get an Apple chap to buy an Android phone.

So a card that's crap at mining would be a shot in the foot, because if the gamers don't jump, at least the miners will buy them.

And if gamers and miners won't buy them because nV or AMD have something better, then what?


Besides that, many here won't remember that Intel had a few GPUs around the AvP1 time. My mate had one and the drivers were shocking. Truly the worst GPU driver support I've seen in my life.

Even my S3 ViRGE and S3 Savage were less problematic.
 