As a side note, look how much higher Nvidia put Pascal than Maxwell... yeah, right! lol
That slide is showing bandwidth available on professional cards using NVLink.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Funny you say that, as I just put mine up for sale too.
I wouldn't call it dumping, though; it's a great card, but I just purchased a Fury Nano from AMDMatt to finally try FreeSync out on my monitor.
Why wouldn't it compete with Volta? You do realise that Pascal and Volta are on the same node, a revision of 16nm. This is not like the jump from 28nm to 16nm, so you will be left with minimal improvements from the node, and maybe little architectural gain too, if there is any.

Realistically, they can't be releasing Vega to compete with Pascal. It just doesn't make sense. It's too late. And seeing as Pascal is basically just Maxwell+, it wouldn't be impressive at all.
I'm thinking purely from a business point of view: AMD must be trying to position themselves here. That would be how to make the most money.
Well, when Pascal was being launched, NVIDIA originally stated 10x faster in tasks, didn't they?
It was supposed to be a massive leap forward, since they skipped 20nm, which flopped, and went straight to 16nm.
Looking at Titan X vs Titan Xp, they made great improvements, but nothing compared to what they were originally stating in the slides.
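To put rough numbers on that: peak FP32 is just shader count × boost clock × 2 ops per FMA. A quick sketch with the public spec figures (approximate, so treat it as ballpark):

```python
# Rough peak FP32 = shader count * boost clock (GHz) * 2 ops per FMA.
# Spec figures below are the public ones, so treat the output as approximate.
def peak_fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * boost_ghz * 2 / 1000.0

titan_x_maxwell = peak_fp32_tflops(3072, 1.075)  # ~6.6 TFLOPS
titan_xp = peak_fp32_tflops(3840, 1.582)         # ~12.1 TFLOPS

print(f"Titan X (Maxwell): {titan_x_maxwell:.1f} TFLOPS")
print(f"Titan Xp:          {titan_xp:.1f} TFLOPS")
print(f"Uplift:            {titan_xp / titan_x_maxwell:.2f}x")  # ~1.84x
```

About 1.8x generation on generation is a great jump, but a long way off 10x.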
I expect Vega to be to Fiji what Pascal was to Maxwell, really.
Something like 10x faster in FP16, which it was.
That doesn't make much sense. The P100 has 21.2 TFLOPS of FP16 in the mezzanine version, so if the Maxwell M40 were still 1:1 FP16 it would be at 7 TFLOPS.
That's not 10x faster in FP16.
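For reference, the implied ratio works out like this (a trivial sketch using the figures above):

```python
# The arithmetic from the post: P100 mezzanine FP16 vs an M40 at a 1:1 FP16 rate.
p100_fp16_tflops = 21.2  # P100 NVLink/mezzanine FP16 spec
m40_fp16_tflops = 7.0    # M40 FP32 spec; at 1:1 this would also be its FP16 rate

print(f"{p100_fp16_tflops / m40_fp16_tflops:.1f}x")  # ~3.0x, not 10x
```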
Nvidia’s CEO went on to state that Pascal has 10x Maxwell’s performance, and he arrived at this conclusion via what he calls “CEO math”. Obviously this was just a humorous way to impress the crowd at GTC 2015 and is based on what was described as “very rough estimates”.
The idea is that if we look at all the improvements coming with Pascal compared to Maxwell, they will collectively add up to make it “roughly” 10 times more efficient at deep-learning compute tasks. Pascal will feature 3x the memory bandwidth of Maxwell, 2x peak single-precision compute performance and 2x the performance per watt.
The 10x faster was a lot of “CEO maths”, as Jen put it.
10 times more efficient, not 10x faster.
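To spell out how that “CEO maths” compounds, multiply the three factors from the quote above. The generous assumption is that they stack multiplicatively, which is a sketch of the keynote pitch, not how any single benchmark behaves:

```python
# "CEO maths": let the keynote's headline Pascal-vs-Maxwell factors compound.
bandwidth = 3      # 3x memory bandwidth of Maxwell
fp32_peak = 2      # 2x peak single-precision compute
perf_per_watt = 2  # 2x performance per watt

# 12 -- in the "roughly 10x" ballpark, but only for a workload that is
# limited by all three factors at once, which real code rarely is.
print(bandwidth * fp32_peak * perf_per_watt)
```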
Yeah, if it's the case that it needs the devs, then it won't get used, whether it's easy to do or not. Take Mantle, for example: AOTS, one guy, about an hour or two to do, and they said it was **** easy, yet it got no interest at all. No one wanted to know about it, not even the owners of the PC gaming market, Nvidia, ...<snip>
When you say Nvidia are the “owners of the PC gaming market”, it reminds me of when Americans imply that the USA somehow owns freedom.
The market is worth around $35bn a year; if anyone owned it, it would be Valve, Blizzard, et al. Riot still make $150m a month from League of Legends, which was released in 2009. So when you talk about Nvidia “owning” the PC gaming market, I don’t know WTF you’re talking about. Are you referring to the super-niche, high-end, non-workstation GPU market?
Thanks, man.

Enjoy that. I have found FreeSync to be pretty good; I would not like to go back.
It might well compete with Volta. I don't know; none of us do.

Why wouldn't it compete with Volta? You do realise that Pascal and Volta are on the same node, a revision of 16nm. This is not like the jump from 28nm to 16nm, so you will be left with minimal improvements from the node, and maybe little architectural gain too, if there is any.
Any Volta up to 450mm² will be roughly the same performance as GP102, so if there is any further performance push it will come from die size. People shouldn't expect too much in the $100-500 segments where Vega competes, with AMD leaving the $700-1000 segment to Nvidia.
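Putting numbers on that die-size argument: here is the implicit scaling model as a sketch. It assumes same-node performance tracks die area linearly, with GP102's public ~471mm² figure as the reference point:

```python
# Implicit model in the post: on the same 16nm-class node, performance
# scales roughly with die area. GP102 (~471 mm^2) is the reference point.
GP102_AREA_MM2 = 471.0  # public figure for GP102

def perf_vs_gp102(volta_area_mm2: float) -> float:
    return volta_area_mm2 / GP102_AREA_MM2

print(f"{perf_vs_gp102(450):.2f}x GP102")  # ~0.96x -- hence "roughly the same"
```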
I strongly disagree. AMD should just forget about all this new tech and concentrate on the here and now, like Nvidia, and go toe to toe with them, bringing out cards at around the same time with the same performance or better, not cards that match performance we've already had for years and surpassed. That's the only way they'll get their share up, which is what they need; they need to be level with Nvidia or ahead.
Once they do that, then they can concentrate on new tech. Otherwise they are just ******* in the wind; bringing out antiques isn't going to do it.
If you think the difference between high and ultra is night and day, then you, sir, are seeing something I am not. To me it's not apparent straight away. I even have to take screenshots standing still, looking at the same scene at high settings and then at ultra settings, to see the difference. If it were night and day I would not need to do that; it would be a case of turn that setting up... apply... OHHH, look at that, looks much better. But nope, not in a single game today. Maybe back in the day, with the likes of Far Cry or Crysis when they came out, you could tell the difference, but in games today, not a chance. You can tell the difference it makes to your GPU, though, when it taxes it and you get less FPS.
It's only when I go from, say, 1440p to 4K that I notice an improvement in visual quality. So I tend to go for 4K and high settings for the best visual experience and smooth gameplay; 1440p and ultra settings do not look as good to me. Sometimes there are gimmicky visuals that ruin the experience, like bloom, DOF and motion blur.
edit -
Also, I said consoles are a baseline, meaning PCs get the better visuals but consoles set the baseline. That's why medium with some settings on low tends to be where consoles sit. You only notice a nice visual uptick from those settings to high; after that it's just minuscule and tanks performance in most cases.
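If anyone wants to make that screenshot comparison less subjective, here's a minimal Pillow sketch. The filenames are made up, and it assumes two same-resolution captures of the same frozen scene:

```python
# A less subjective version of the high-vs-ultra screenshot comparison.
# Filenames are placeholders; any two same-size, same-scene captures will do.
from PIL import Image, ImageChops

high = Image.open("scene_high.png").convert("RGB")
ultra = Image.open("scene_ultra.png").convert("RGB")

diff = ImageChops.difference(high, ultra)
bbox = diff.getbbox()  # None means the two captures are pixel-identical
print("Identical" if bbox is None else f"Differences confined to region {bbox}")
diff.save("scene_diff.png")  # bright pixels mark where ultra actually changed anything
```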
I strongly disagree.
Here and now is what got them into this situation, where even people far from tech know they don't want an AMD CPU or GPU for their gaming PC.
AMD tried here-and-now improvements with Bulldozer and with iterations of GCN. Sandy Bridge happened and they had no answer. Maxwell happened and Fiji flopped. Pascal happened and they had no answer.
The long-term picture presented on Financial Analyst Day makes much more sense. After 2-3 years in development they are now in a position to compete everywhere, and for a long while.