Poll: Do you think AMD will be able to compete with Nvidia again during the next few years?

  • Total voters
    213
  • Poll closed.
Associate
Joined
10 Jul 2009
Posts
1,559
Location
London
Where is your source for the number above?

Must be alternative universe. I know they have quite a few lower end cards overstocked, but nothing Nvidia wouldn't be able to shift with their abilities. Though even if they fail to get rid of them and write them off, that's a blip in one quarter or maybe two, and it should be dwarfed by their other sales.
 
Soldato
Joined
28 May 2007
Posts
18,263
AMD physically can't compete right now with an aging GCN architecture.

Imagine Bulldozer vs Core. We can hope that they have a new architecture coming through like Zen - eventually - did. But look how many years AMD were left flogging a dead horse (Bulldozer) before Zen arrived.

Same deal now on the graphics side. GCN is done, but it's all they have right now.
But AMD are more than competing with Nvidia. AMD are beating Nvidia in every segment where they have a product...
 
Caporegime
Joined
18 Oct 2002
Posts
39,329
Location
Ireland
What is Nvidia's answer to a £300-350 Vega card?

What is Nvidia's answer to a £170 RX 570, or a sub-£200 RX 580? Even older architecture than Vega.

It's funny how, if a company isn't competing at the bleeding edge, they are considered not to be competing. Something like a 1080 Ti or 2080 Ti doesn't compete with mid- or lower-range offerings in terms of sheer numbers sold.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
It's funny how, if a company isn't competing at the bleeding edge, they are considered not to be competing. Something like a 1080 Ti or 2080 Ti doesn't compete with mid- or lower-range offerings in terms of sheer numbers sold.
It's got nothing to do with the bleeding edge.

AMD's mid-range offerings are extremely power hungry, and harder to cool. The excessive amount of power they need doesn't translate to "better than nVidia at every price point"... far from it. Which is a ridiculous thing to say, because "better" entirely depends on your criteria.

Unfortunately the fanboys use whatever criteria shows the object of their affections in the best light at the time, and are prone to making silly statements such as the above.

My criteria haven't changed because I couldn't give a flying toss about either company; I just want an efficient, quiet, value for money mid-range card. And Polaris is a bust by those measures.
 
Soldato
Joined
28 May 2007
Posts
18,263
It's got nothing to do with the bleeding edge.

AMD's mid-range offerings are extremely power hungry, and harder to cool. The excessive amount of power they need doesn't translate to "better than nVidia at every price point"... far from it. Which is a ridiculous thing to say, because "better" entirely depends on your criteria.

Unfortunately the fanboys use whatever criteria shows the object of their affections in the best light at the time, and are prone to making silly statements such as the above.

My criteria haven't changed because I couldn't give a flying toss about either company; I just want an efficient, quiet, value for money mid-range card. And Polaris is a bust by those measures.

If an extra 40 watts of power use is extremely important to you, then you must live a very interesting life. Do you turn the gas off when you turn your toast over?
 
Associate
Joined
6 Dec 2013
Posts
1,877
Location
Nottingham
In all seriousness, I have a work colleague who complains that his 1080 only has one power connector, as if with two he could overclock it more. It's limited in terms of power available to the core, etc.
 
Last edited:
Soldato
Joined
28 May 2007
Posts
18,263
In all seriousness, I have a work colleague who complains that his 1080 only has one power connector, as if with two he could overclock it more. It's limited in terms of power available to the core, etc.

Yeah, Nvidia hamstrings most of its line-up. The GTX 1080's power limit leaves it at least 50 watts short of what it could use. Get him a nice 8-core Intel chip.
 
Soldato
Joined
14 Aug 2009
Posts
2,795
Nope. Nothing AMD themselves could do to change this. The only way AMD would have a chance of winning the market back is if Nvidia somehow royally shoot themselves in the foot with their huge ego and completely p off the consumers, the way Bethesda did.

nVIDIA messing up would help AMD, but AMD can help themselves if they show consistency in what they're doing and a little more involvement with the game devs.

For instance, the R9 290/290X were perceived as bad at launch due to their cooler, which was quite loud and made the cards throttle. Fury X was "the overclocker's dream", while the Vega cards were pushed far beyond their optimal frequencies and voltages, for minimal gains, just to be at a level AMD perceived as OK - and these are just a few examples.

Then you have the software side, where they have shown some nice stuff, such as AI on the GPU as far back as the HD 48xx series, plus low-level APIs. I get around a 20% increase in-game on an overclocked R9 290 between DX11 and DX12 in Hitman at 5760x1080 (3x 1080p) with almost all settings turned up (just shadow resolution on medium instead of high, and SS at 1.0)! I'd say that's a huge gain! However, their push with devs in general to implement such APIs, which would be extremely beneficial for their products, is... mediocre at best, even though they have the console market on their side.

Sadly, AMD's stance is: hey, we've got this nice stuff, it's free to implement, have fun using it! They forget developers are not mainly driven by passion, but by money and time constraints! When AMD decide to actually do something about their perceived place in the market (in consumers' eyes), then they can help themselves climb out of the hole they've dug themselves into.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
nVIDIA messing up would help AMD, but AMD can help themselves if they show consistency in what they're doing and a little more involvement with the game devs.

For instance, the R9 290/290X were perceived as bad at launch due to their cooler, which was quite loud and made the cards throttle. Fury X was "the overclocker's dream", while the Vega cards were pushed far beyond their optimal frequencies and voltages, for minimal gains, just to be at a level AMD perceived as OK - and these are just a few examples.

Then you have the software side, where they have shown some nice stuff, such as AI on the GPU as far back as the HD 48xx series, plus low-level APIs. I get around a 20% increase in-game on an overclocked R9 290 between DX11 and DX12 in Hitman at 5760x1080 (3x 1080p) with almost all settings turned up (just shadow resolution on medium instead of high, and SS at 1.0)! I'd say that's a huge gain! However, their push with devs in general to implement such APIs, which would be extremely beneficial for their products, is... mediocre at best, even though they have the console market on their side.

Sadly, AMD's stance is: hey, we've got this nice stuff, it's free to implement, have fun using it! They forget developers are not mainly driven by passion, but by money and time constraints! When AMD decide to actually do something about their perceived place in the market (in consumers' eyes), then they can help themselves climb out of the hole they've dug themselves into.

AMD give you better 2D and 3D image quality.
Intel sucked a lot of AMD's money through illegal business practices.
nvidia sucked some of AMD's money through illegal business practices.
 