AMD VEGA confirmed for 2017 H1

Status
Not open for further replies.
As a side note, look how much higher Nvidia put Pascal than Maxwell... yeah, right! lol

That slide is showing the bandwidth available on professional cards using NVLink. :)
 
Realistically, they can't be releasing Vega to compete with Pascal. It just doesn't make sense; it's too late. And seeing as Pascal is basically just Maxwell+, it wouldn't be impressive at all.
I'm thinking purely from a business point of view: AMD must be trying to position themselves here. That would be how to make the most money.
Why wouldn't it compete with Volta? You do realize that Pascal and Volta are on the same node, a revision of 16nm. This is not like the jump from 28nm to 16nm, so you will be left with minimal improvements from the node, and maybe little architectural gain too, if there is any.
Any Volta die up to 450mm² will be roughly the same performance as GP102, so if there is any further performance push it will come from die size. People shouldn't expect too much in the $100-500 segments where Vega competes, while AMD leaves the $700-1000 segment to Nvidia.
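Taking the poster's assumption at face value (that at the same node, performance scales roughly with die area times whatever the process refinement adds), the claim can be sanity-checked in a few lines. The 450mm² figure is the poster's hypothetical, and the 10% node factor is an illustrative assumption, not a confirmed spec:

```python
# Back-of-the-envelope check of "same node, performance scales with die area".
GP102_AREA_MM2 = 471.0   # published GP102 die size
VOLTA_AREA_MM2 = 450.0   # the poster's hypothetical upper bound
NODE_GAIN = 1.10         # assumed ~10% uplift from the 12FF refinement

# Relative performance under the area-times-node-gain assumption
relative_perf = (VOLTA_AREA_MM2 / GP102_AREA_MM2) * NODE_GAIN
print(f"~{relative_perf:.2f}x GP102")  # ~1.05x, i.e. "roughly the same"
```

On those assumptions a 450mm² part lands within a few percent of GP102, which is the poster's point: any real gain has to come from more area or a better architecture.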
 
Well, when Pascal was being launched, NVIDIA originally stated 10x faster in tasks, didn't they?

It was supposed to be a massive leap forward, since they skipped 20nm, which flopped, and could go straight to 16nm.

Looking at Titan X vs Titan Xp, they made great improvements, but nothing compared to what they were stating originally in slides.

I expect Vega to be to AMD's Fiji what Pascal was to Maxwell, really.


Something like 10x faster in FP16, which it was.
 
That doesn't make much sense. The P100 has 21.2 TFLOPS of FP16 in the mezzanine form factor, so if the Maxwell M40 were still 1:1 FP16 it would be about 7 TFLOPS.

That's not 10x faster in FP16.

The 10x figure was a lot of "CEO math", as Jen put it.

Nvidia’s CEO went on to state that Pascal has 10x Maxwell’s performance, and he arrived at this conclusion via what he calls “CEO math”. Obviously this was just a humorous way to impress the crowd at GTC 2015 and is based on what was described as “very rough estimates”.

The idea is that if we look at all the improvements coming up with Pascal compared to Maxwell, they will collectively add up to make it “roughly” 10 times more efficient at deep learning compute tasks. Pascal will feature 3x the memory bandwidth of Maxwell, 2x peak single precision compute performance and 2x the performance per watt.

10 times more efficient, not 10x faster.
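The arithmetic behind both posts fits in a few lines. The gain factors below are the ones quoted in the thread, treated purely as illustrative numbers:

```python
# The "CEO math" from GTC 2015: multiply the individual claimed gains together.
bandwidth_gain = 3.0       # 3x memory bandwidth
fp32_gain = 2.0            # 2x peak single-precision compute
perf_per_watt_gain = 2.0   # 2x performance per watt
ceo_math = bandwidth_gain * fp32_gain * perf_per_watt_gain
print(ceo_math)  # 12.0 -> "roughly 10x", as a very rough composite estimate

# The actual FP16 peak comparison from the thread:
p100_fp16_tflops = 21.2    # Tesla P100 mezzanine, 2:1 FP16
m40_fp16_tflops = 7.0      # M40 at its ~7 TFLOPS FP32 rate (1:1 FP16)
print(round(p100_fp16_tflops / m40_fp16_tflops, 1))  # 3.0 -> ~3x, not 10x
```

So the "10x" only works by multiplying independent improvements together; the straight FP16 peak ratio is nearer 3x, which is the point being made above.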
 
Yeah, if it's the case that it needs the devs, then it won't get used, whether it's easy to do or not. Take Mantle, for example: AOTS, one guy, about an hour or two to do, and they said it was **** easy, yet it got no interest at all. No one wanted to know about it, not even the owners of the PC gaming market, Nvidia. ...<snip>

When you say Nvidia are the “owners of the PC gaming market”, it reminds me of when Americans imply that the USA somehow owns freedom.

The market is worth around $35bn a year; if anyone owned it, it would be Valve and Blizzard et al. Riot still make $150m a month from League of Legends, which was released in 2009. So when you talk about Nvidia “owning” the PC gaming market, I don’t know WTF you’re talking about. Are you referring to the super-niche, high-end, non-workstation GPU market?
 
AMD should just forget about all this new tech and concentrate on the here and now, like Nvidia, and go toe to toe with them: bringing out cards at around the same time, at around the same performance or better, not cards that match performance we've already had for years and surpassed. That's the only way they'll get their share up, which is what they need; they need to be level with Nvidia, or ahead.

Once they do that, then they can concentrate on new tech; otherwise they are just ******* in the wind, and bringing out antiques isn't going to do it.
When you say Nvidia are the “owners of the PC gaming market”, it reminds me of when Americans imply that the USA somehow owns freedom.

The market is worth around $35bn a year; if anyone owned it, it would be Valve and Blizzard et al. Riot still make $150m a month from League of Legends, which was released in 2009. So when you talk about Nvidia “owning” the PC gaming market, I don’t know WTF you’re talking about. Are you referring to the super-niche, high-end, non-workstation GPU market?

Every time a game comes out for our PCs (barring the odd one here and there), it's got Nvidia slapped on it, so to me they own the market. Our PC games are done for Nvidia, as it's their cards that are in just about every PC going; it's console games that are done for AMD hardware, but that's a different market.

Unless we get AMD's share up with, and over, Nvidia's on our PCs, nothing will change.
 
Why wouldn't it compete with Volta? You do realize that Pascal and Volta are on the same node, a revision of 16nm. This is not like the jump from 28nm to 16nm, so you will be left with minimal improvements from the node, and maybe little architectural gain too, if there is any.
Any Volta die up to 450mm² will be roughly the same performance as GP102, so if there is any further performance push it will come from die size. People shouldn't expect too much in the $100-500 segments where Vega competes, while AMD leaves the $700-1000 segment to Nvidia.
It might well compete with Volta. I don't know; none of us do.
I'm just saying that, from a business sense, that is a sensible outcome.
 
Why wouldn't it compete with Volta? You do realize that Pascal and Volta are on the same node, a revision of 16nm. This is not like the jump from 28nm to 16nm, so you will be left with minimal improvements from the node, and maybe little architectural gain too, if there is any.
Any Volta die up to 450mm² will be roughly the same performance as GP102, so if there is any further performance push it will come from die size. People shouldn't expect too much in the $100-500 segments where Vega competes, while AMD leaves the $700-1000 segment to Nvidia.

Someone wasn't paying attention to nVidia's GPU Tech Conference - while mostly concerned with the professional cards, there are loads of changes over Pascal:

https://devblogs.nvidia.com/parallelforall/inside-volta/

12FF is going into risk production next week, I believe, so if nVidia is planning on bringing consumer Volta to GeForce on a normal kind of timescale, it will likely show up sooner rather than later. The node also gives a potential 10% performance increase or 25% power reduction, but through a mixture of area reduction and other techniques some designs can attain up to a 33% performance increase.
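One hypothetical way the quoted figures could compose into "up to 33%": combine the ~10% clock uplift with reinvesting the area saving in roughly 21% more units. The split below is an assumption for illustration, not a published breakdown:

```python
# Hypothetical composition of the "up to 33%" 12FF figure.
clock_gain = 1.10   # quoted ~10% performance from the node alone
area_gain = 1.21    # assumed: area reduction reinvested in ~21% more units
combined = clock_gain * area_gain
print(f"~{(combined - 1) * 100:.0f}% combined uplift")  # ~33%
```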
 
AMD should just forget about all this new tech and concentrate on the here and now, like Nvidia, and go toe to toe with them: bringing out cards at around the same time, at around the same performance or better, not cards that match performance we've already had for years and surpassed. That's the only way they'll get their share up, which is what they need; they need to be level with Nvidia, or ahead.
Once they do that, then they can concentrate on new tech; otherwise they are just ******* in the wind, and bringing out antiques isn't going to do it.
I strongly disagree.
Here and now was what got them to this situation, where even people far from tech know they don't want an AMD CPU or GPU for their gaming PC.
AMD tried here and now improvements with bulldozer and with iterations of GCN. SandyBridge happened and they had no answer. Maxwell happened and Fiji flopped. Pascal happened and they had no answer.

The long-term picture presented on financial day makes much more sense. After 2-3 years in development they are now in a position to compete everywhere, and for a long while.
 
If you think the difference between high and ultra is night and day, then you, sir, are seeing something I am not. To me it's not apparent straight away. I even have to take screenshots standing still, looking at the same scene at high settings then ultra settings, to see the difference. If it was night and day I would not need to do that. I would be like: turn that setting up... apply... OHHH, look at that, looks much better. But nope, not a single game today. Maybe back in the day with the likes of Far Cry or Crysis, when they came out, you could tell the difference, but in games today, not a chance. But you can tell the difference it makes to your GPU when it taxes it and you get less FPS.

It's only when I go from, say, 1440p to 4K that I notice an improvement in visual quality. So I tend to go for 4K and high settings for the best visual experience and smooth gameplay; 1440p and ultra settings does not look as good to me. Sometimes there are some gimmicky visuals that ruin the experience, like bloom, DOF and movement blur (forgot the name).

edit -

Also, I said consoles are a baseline, meaning PCs get the better visuals but consoles are the baseline. That's why medium, with some settings on low, tends to be where consoles are. But you only notice a nice visual uptick from those settings to high; after this it's just minuscule and tanks performance in most cases.

I said the difference between (medium) and (high or ultra), not between high and ultra :)
 
I strongly disagree.
Here and now was what got them to this situation, where even people far from tech know they don't want an AMD CPU or GPU for their gaming PC.
AMD tried here and now improvements with bulldozer and with iterations of GCN. SandyBridge happened and they had no answer. Maxwell happened and Fiji flopped. Pascal happened and they had no answer.

The long term picture presented on financial day makes much more sense. After 2-3 years in development they are now in position to compete everywhere and for a long while.

I am not sure that we can decide here and now what AMD should do. In my eyes, you have to offer a product first, and then if you want to innovate, you can. If all you have is a different product, you can either be extremely successful, if it's amazing, or fall flat on your face if it isn't. There's nothing for AMD to fall back on if they fail with Vega. Not long now, hopefully, before we know for sure.
 
I strongly disagree.
Here and now was what got them to this situation, where even people far from tech know they don't want an AMD CPU or GPU for their gaming PC.
AMD tried here and now improvements with bulldozer and with iterations of GCN. SandyBridge happened and they had no answer. Maxwell happened and Fiji flopped. Pascal happened and they had no answer.

The long term picture presented on financial day makes much more sense. After 2-3 years in development they are now in position to compete everywhere and for a long while.

The opposite, I think. With Bulldozer they tried pushing more but slower cores, chasing an envisioned future where things were more parallel and less dependent on IPC, and it just didn't happen, while Intel concentrated on a CPU that primarily could run existing software very fast and secondarily had an eye to the future. Likewise with these new features on Vega: they seem to just assume that by putting them out there people will start using them, and historically that hasn't proven true in the majority of cases. Same with GCN: while nVidia concentrated on what worked best with DX11, AMD pushed an architecture more suited to advanced DX12 use, but no one took up DX12-type APIs in any kind of hurry.

I hope I'm wrong, but Vega is starting to look like kind of a turkey, with many of the potential bleeding-edge improvements that could have really made it shine increasingly appearing to have substantial caveats to real-world leveraging of those features.
 