
Poll: Do you think AMD will be able to compete with Nvidia again during the next few years?

  • Total voters: 213
  • Poll closed.

It all had to do with the costs. Everyone says this.

Really, everyone says this? Two people in this thread have already told you it wasn't. Costs are a factor in every business decision, but they weren't the primary reason for going with AMD. Neither Nvidia nor Intel could provide the solution. Sure, if you're being pedantic, maybe Nvidia could have spent billions designing an x86 APU, or Intel could have partnered with Nvidia to make one, but we all know that Nvidia and Intel don't play nice, so that wasn't going to happen.

At the end of the day Sony and Microsoft decided to go with an x86 SoC design for their consoles. Only one company had everything in place to meet that design, and that was AMD.
 
AMD could offer a competitive price since they had the tech in place, and they also had experience working with both TSMC and GF for CPU and GPU products, which would have been important for dual sourcing. For example, the Jaguar core was apparently designed for easier portability between different process nodes.
 
It all had to do with the costs. Everyone says this.
Adding to @melmac's and @CAT-THE-FIFTH's comments, money wasn't the only reason.

Consoles. Sony required a powerful (for consoles) SoC with async compute. No one other than AMD had this in 2013, and Nvidia doesn't even have it today (software async compute won't work on consoles). And the latter will NEVER obtain an x86 licence.

MS followed, also wanting such tech, in addition to wanting its games to be closed and easily portable to PC through their Store without the need to rewrite them. Again, neither Intel nor Nvidia could provide such a product.
Also, both had been bitten by Nvidia in the previous round of consoles.

The latter is also the reason Apple ditched Nvidia. AMD was easier to discuss specs with and provided the products required without strings attached.

For Nvidia that wasn't the case. The 8600M GT is a good example of why Apple had good reason to ditch Nvidia. It even burned out laptops, because its power requirements were out of spec, and Nvidia wouldn't acknowledge that its product was the cause.

You cannot have that in the corporate world. AMD has a good name as a partner, and that is the reason even Intel is using AMD GPUs in their Kaby Lake-G products, and not Nvidia products.
 
Yes, quite agree, and sorry for my part in that.

So back on topic, can AMD catch back up?

Looking at the poll that has been added to this thread, "No, NVIDIA have too much of a lead to overcome" is leading the way with 35.8% of the vote, which quite surprised me. Like 21.9% of the other voters, I feel that AMD will catch up, but that it is going to take a while yet. If you also count the 26.4% who think it might happen with Navi, that makes 48.3%, or nearly half of the 201 voters, who agree that AMD will be able to catch back up; we just don't agree on how long it might take.
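Just to show how I got that combined figure, here is a quick back-of-the-envelope sketch. The option labels in the comments are paraphrased from the post, and 201 is the vote count at the time of writing rather than the 213 the poll closed at:

```python
# Back-of-the-envelope check of the poll percentages quoted above,
# assuming 201 votes had been cast at the time of writing.
votes_cast = 201

yes_eventually = 0.219   # roughly: "Yes, but it will take a while"
yes_with_navi = 0.264    # roughly: "It might happen with Navi"

optimistic_share = yes_eventually + yes_with_navi
print(f"Combined 'AMD will catch up' share: {optimistic_share:.1%}")        # -> 48.3%
print(f"Roughly {round(optimistic_share * votes_cast)} of {votes_cast} voters")  # -> 97
```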

So with that sorted, we can ask ourselves just what AMD need to do to catch back up. Of course it would be easy to say "just build and release some competitive GPUs", but I feel there is more to it than that.

Obviously GPU designs don't make themselves, and we all know that AMD have been having a hard time of it financially. Hopefully the success AMD are having with Ryzen and their other Zen-based CPUs will give them a nice cash boost to put into GPU design, and that will take time to come to fruition, which is why I voted "Yes but it will take 3 generations". As AMD have shown, their Polaris-based 500 series is easily a match for the competition's lower-end GPUs, and their latest Vega designs can keep up with Nvidia up to the 1080, so building on that they need to refresh the Vega line to bring it up to the next tier. Certainly not beyond their capability. I suppose a lot will depend on how far Nvidia raise the bar with the next-gen stuff, so to speak.

Your thoughts, so we can have a sensible discussion about it?

Sell more cards to miners for £1000 a pop?
 
Do you remember that the Polaris/Vega advert had the words "Poor Volta" in it?
Ridiculous, considering it couldn't even compete with Pascal, let alone what Volta will bring.
That really disappointed me. It all felt very "No Man's Sky". There is a difference between clever marketing and utter bullpoo.
I hope AMD come back strong, but as others have said, every day that passes they slip further and further behind. To be honest, I am pinning my hopes on Intel.
 
Do you remember that the Polaris/Vega advert had the words "Poor Volta" in it?
Ridiculous, considering it couldn't even compete with Pascal, let alone what Volta will bring.
That really disappointed me. It all felt very "No Man's Sky". There is a difference between clever marketing and utter bullpoo.
I hope AMD come back strong, but as others have said, every day that passes they slip further and further behind. To be honest, I am pinning my hopes on Intel.

Do you realise that the video about "Poor Volta" was in relation to compute and not game graphics?
On that front the Instinct is an absolute beast, especially given the price, and its 7nm incarnation even walks all over the Volta compute GPUs.
 
Do you realise that the video about "Poor Volta" was in relation to compute and not game graphics?
On that front the Instinct is an absolute beast, especially given the price, and its 7nm incarnation even walks all over the Volta compute GPUs.


More complete rubbish. The "poor Volta" marketing had a Radeon label all over it, presented at a Vega gaming preview event.

Vega does not compete with Volta in terms of compute at all. It has completely castrated FP64 support, and Volta walks all over it for deep learning.
The 7nm Vega 2.0 will be different, but that is not released yet, and you have no idea what Nvidia will be releasing on 7nm.

Why do you repeatedly come up with these outrageous lies?
 
More complete rubbish. The "poor Volta" marketing had a Radeon label all over it, presented at a Vega gaming preview event.

Vega does not compete with Volta in terms of compute at all. It has completely castrated FP64 support, and Volta walks all over it for deep learning.
The 7nm Vega 2.0 will be different, but that is not released yet, and you have no idea what Nvidia will be releasing on 7nm.

Why do you repeatedly come up with these outrageous lies?

Did you see any mainstream Volta graphics card? No.
Only the Titan V behemoth, so you do not know how these could have performed.
 
Do you realise that the video about "Poor Volta" was in relation to compute and not game graphics?
On that front the Instinct is an absolute beast, especially given the price, and its 7nm incarnation even walks all over the Volta compute GPUs.
Not a CHANCE. The advert was clearly, purposefully and entirely designed for and aimed at gamers.
 
Adding to @melmac's and @CAT-THE-FIFTH's comments, money wasn't the only reason.

Consoles. Sony required a powerful (for consoles) SoC with async compute.
You will have to provide evidence for that claim. I doubt Sony cared about async compute at all; it makes a few percent difference on a GPU that has bad load balancing, and it requires developer resources to utilise. Sony would have much preferred simply having the extra 5% performance without any developer work.

No one other than AMD had this in 2013, and Nvidia doesn't even have it today (software async compute won't work on consoles). And the latter will NEVER obtain an x86 licence.
Nvidia doesn't have software async compute; it is all done in hardware. The scheduling is done in software because this is much more flexible and can achieve higher performance than a fixed scheduling function. This is much more advanced than a pure software scheduler, which Nvidia had in Fermi.

MS followed, also wanting such tech, in addition to wanting its games to be closed and easily portable to PC through their Store without the need to rewrite them. Again, neither Intel nor Nvidia could provide such a product.
Also, both had been bitten by Nvidia in the previous round of consoles.

The latter is also the reason Apple ditched Nvidia. AMD was easier to discuss specs with and provided the products required without strings attached.

For Nvidia that wasn't the case. The 8600M GT is a good example of why Apple had good reason to ditch Nvidia. It even burned out laptops, because its power requirements were out of spec, and Nvidia wouldn't acknowledge that its product was the cause.

You cannot have that in the corporate world. AMD has a good name as a partner, and that is the reason even Intel is using AMD GPUs in their Kaby Lake-G products, and not Nvidia products.

Apple has also been burned by AMD's GPUs, such as the constantly failing 6970s.

In the real world Apple doesn't care much about that; they care more about pricing.



As to the consoles, Sony and MS wanted a cheap SoC; AMD could provide that and were willing to minimise their margins to secure a deal. That is not very surprising. Intel enforcing a monopoly on x86 licensing prevented Nvidia from being a competitor in this space.
 
Not a CHANCE. The advert was clearly, purposefully and entirely designed for and aimed at gamers.

Did you see any mainstream Volta graphics card?
There was only the Titan V, costing an arm and a leg, with a behemoth chip under it (812 mm², 21.1 billion transistors, HBM2), yet it was only ~24% faster than the GTX 1080 Ti FE (471 mm², 12 billion transistors, GDDR5X), a chip almost HALF its size.

Any good overclocked GTX 1080 Ti (Xtreme, Lightning, etc.) could get very close to it, with only a single-digit performance difference.
 
Did you see any mainstream Volta graphics card? No.
Only the Titan V behemoth, so you do not know how these could have performed.

What has this got to do with anything? Vega doesn't compete with Pascal, let alone Volta, and very soon you will get to see what Volta in a mainstream gaming configuration will do.

You just have to admit that AMD's propaganda campaign was terrible. Why try to lie about what AMD said when it is all there in black and white?
 
What has this got to do with anything? Vega doesn't compete with Pascal, let alone Volta, and very soon you will get to see what Volta in a mainstream gaming configuration will do.

You just have to admit that AMD's propaganda campaign was terrible. Why try to lie about what AMD said when it is all there in black and white?

It has to do with everything. They didn't write "poor Pascal".
And you haven't seen a single mainstream Volta card; you only believe that it will be better than Pascal.

However, I point you at the Titan V and its benchmarks in comparison to the GTX 1080 Ti FE, and you ignore them.
It is a chip twice the size of the GTX 1080 Ti, with much faster VRAM, yet only ~24% faster than the GTX 1080 Ti FE (not the best sample either).
If you chopped HALF of the Volta chip off to bring it down to the size of the GTX 1080 Ti, what performance could it have?
That's extrapolation, as in mathematics, not blind belief.
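To put rough numbers on that extrapolation, here is a quick sketch using the figures quoted above, assuming gaming performance scales linearly with die area (which ignores the Titan V's tensor cores, FP64 units and HBM2 interface, so treat it as napkin maths rather than a prediction):

```python
# Napkin maths behind the "chop the chip in half" extrapolation, using the
# figures quoted in the thread: GV100 ~812 mm^2 vs GP102 ~471 mm^2, and the
# Titan V ~24% faster than a GTX 1080 Ti FE in games.
titan_v_area_mm2 = 812
gtx_1080_ti_area_mm2 = 471
titan_v_speedup = 1.24   # gaming performance relative to a GTX 1080 Ti FE

# Gaming performance per mm^2 of die, normalised to the 1080 Ti.
relative_perf_per_mm2 = (titan_v_speedup / titan_v_area_mm2) / (1.0 / gtx_1080_ti_area_mm2)
print(f"Volta perf per mm^2 vs Pascal: {relative_perf_per_mm2:.2f}x")            # ~0.72x

# A hypothetical Volta die cut down to GP102's size, under naive linear scaling.
scaled_perf = titan_v_speedup * (gtx_1080_ti_area_mm2 / titan_v_area_mm2)
print(f"1080 Ti-sized Volta under linear scaling: {scaled_perf:.2f}x a 1080 Ti")  # ~0.72x
```

Under that naive assumption a 1080 Ti-sized Volta lands at roughly 0.72x a 1080 Ti, so the argument hinges entirely on whether linear area scaling is a fair assumption.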
 
It may be funny, but I think that was sarcasm aimed at themselves and the VOLTAge levels and power consumption required to reach GTX 1080 levels, let alone GTX 1080 Ti levels.
 
It has to do with everything. They didn't write "poor Pascal".
And you haven't seen a single mainstream Volta card; you only believe that it will be better than Pascal.

However, I point you at the Titan V and its benchmarks in comparison to the GTX 1080 Ti FE, and you ignore them.
It is a chip twice the size of the GTX 1080 Ti, with much faster VRAM, yet only ~24% faster than the GTX 1080 Ti FE (not the best sample either).
If you chopped HALF of the Volta chip off to bring it down to the size of the GTX 1080 Ti, what performance could it have?
That's extrapolation, as in mathematics, not blind belief.

WTF are you taking? Because smoking does not provide that much of a delusional experience.
A 30-second Google:
https://www.anandtech.com/show/12170/nvidia-titan-v-preview-titanomachy/5

[attached compute benchmark chart from the AnandTech Titan V preview]


Maybe I'm blind, but this looks like 200% to me, not 23%...


And all here know that Vega was a PATHETIC piece of late junk.
 